Member Of Engineering (Pre-Training)

Details of the offer

In this decade, the world will create artificial intelligence that reaches human-level intelligence (and beyond) by combining learning and search. Only a small number of companies will achieve this. Their ability to stack advantages and pull ahead will determine who survives and wins. These companies will move faster than anyone else. They will attract the world's most capable talent. They will be at the forefront of applied research and engineering at scale. They will create powerful economic engines. They will continue to scale their training to larger & more capable models. They will be given the right to raise large amounts of capital along their journey to enable this.
poolside exists to be one of these companies - to build a world where AI will drive the majority of economically valuable work and scientific progress.
We believe that software development will be the first major capability in neural networks that reaches human-level intelligence because it's the domain where we can combine Search and Learning approaches the best.
At poolside we believe our applied research needs to culminate in products that are put in the hands of people. Today we focus on building for a developer-led, increasingly AI-assisted world. We believe that the current capabilities of AI lead to incredible tooling that can assist developers in their day-to-day work. We also believe that as we increase the capabilities of our models, we increasingly empower anyone in the world to build software. We envision a future where not just 100 million people can build software, but 2 billion.
We are a remote-first team that sits across Europe and North America and comes together once a month in-person for 3 days and for longer offsites twice a year.
Our R&D and production teams combine more research-oriented and more engineering-oriented profiles; however, everyone deeply cares about the quality of the systems we build and has a strong underlying knowledge of software development. We believe that good engineering leads to faster development iterations, which allows us to compound our efforts.
ABOUT THE ROLE
You would be working on our pre-training team, focused on building out our distributed training of Large Language Models and major architecture changes. This is a hands-on role where you'll be programming and implementing LLM architectures (dense & sparse) and distributed training code all the way from data to tensor parallelism, while researching potential optimizations (from basic operations to communication) and new architectures & distributed training strategies. You will have access to thousands of GPUs on this team.
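As an illustration of the data-to-tensor-parallelism spectrum this role covers, here is a minimal, hypothetical sketch of a column-parallel linear layer in PyTorch with torch.distributed. It is not poolside's actual training stack; the class name and sharding layout are assumptions for illustration, and it assumes a process group has already been initialized with dist.init_process_group.

```python
# Forward-only sketch of tensor parallelism (illustrative assumption, not
# poolside's stack): each rank owns a column shard of a linear layer's weight
# and the full output is rebuilt with an all-gather.
import torch
import torch.nn as nn
import torch.distributed as dist


class ColumnParallelLinear(nn.Module):
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        world_size = dist.get_world_size()
        assert out_features % world_size == 0, "output dim must split evenly across ranks"
        # Each rank holds only its shard of the weight matrix.
        self.local = nn.Linear(in_features, out_features // world_size, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y_local = self.local(x)  # [batch, out_features / world_size]
        shards = [torch.empty_like(y_local) for _ in range(dist.get_world_size())]
        dist.all_gather(shards, y_local)  # communication step: collect every rank's shard
        return torch.cat(shards, dim=-1)  # [batch, out_features]
```

Data parallelism sits at the other end of the same spectrum: the full model is replicated per rank (for example with torch.nn.parallel.DistributedDataParallel) and only gradients are communicated. Much of the work described above lives between these two extremes and in the kernels and collectives that make them fast; note the sketch is forward-only, and a real implementation would need a differentiable all-gather.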
YOUR MISSION
To train the best foundational models for source code generation in the world, in minimum time and with maximum hardware utilization.
RESPONSIBILITIES
- Follow the latest research on LLMs and source code generation.
- Propose and evaluate innovations, both in the quality and the efficiency of training.
- Do LLM-Ops: babysit and analyze experiments, and iterate.
- Write high-quality Python, Cython, C/C++, Triton, and CUDA code.
- Work in the team: plan future steps, discuss, and always stay in touch.

SKILLS & EXPERIENCE
- Experience with Large Language Models (LLMs).
- Deep knowledge of Transformers is a must.
- Knowledge of or experience with cutting-edge training tricks.
- Knowledge of or experience with distributed training.
- Have trained LLMs from scratch.
- Have coded LLMs from scratch.
- Knowledge of deep learning fundamentals.
- Strong machine learning and engineering background.
- Research experience.
- Authorship of scientific papers on applied deep learning, LLMs, source code generation, etc. is a nice-to-have.
- Can freely discuss the latest papers and descend to fine details.
- Is reasonably opinionated.
- Programming experience.
- Strong algorithmic skills.
- Python with PyTorch or JAX.
- C/C++, CUDA, Triton.
- Uses modern tools and is always looking to improve.
- Strong critical thinking and the ability to question code quality policies when applicable.
- Prior experience in non-ML programming, especially outside Python, is a nice-to-have.

PROCESS
- Intro call with one of our Founding Engineers.
- Technical interview(s) with one of our Founding Engineers.
- Team-fit call with Beatriz, our Head of People.
- Final interview with Eiso, our CTO & Co-Founder.

BENEFITS
- Fully remote work & flexible hours.
- 37 days/year of vacation & holidays.
- Health insurance allowance for you and dependents.
- Company-provided equipment.
- Wellbeing, always-be-learning, and home office allowances.
- Frequent team get-togethers.