Responsibilities
We are looking for talented individuals to join us for an internship in 2027. PhD internships at our company provide students with the opportunity to actively contribute to our products and research, and to the organization's future plans and emerging technologies. Our dynamic internship experience blends hands-on learning, enriching community-building and development events, and collaboration with industry experts.
Applications are reviewed on a rolling basis, so we encourage you to apply early. Please clearly state your availability (start and end dates) in your resume.
Team Introduction: The Applied Machine Learning (AML) team is committed to the research and deployment of next-generation core machine learning technologies. This covers large pretrained models and device-cloud collaborative learning, as well as wide-ranging applications in search, recommendation, advertising, auditing, federated learning, and more. The team has a strong foundation in scientific research, engineering, and product implementation.
Our team members have diverse backgrounds spanning natural language processing (NLP), computer vision (CV), multimodality, graph computing, search and recommendation, federated learning, and other fields, and have published more than 100 papers at top-tier conferences.
Topic Content: Large-scale recommendation systems are increasingly being adopted across products such as short-video, text-based community, and image platforms, and modality-specific information plays an ever-growing role in recommendation.
In ByteDance's practice, we have found that modality information serves as an effective, generalizable set of features for recommendation and other business scenarios. Research on end-to-end, ultra-large-scale multimodal recommendation systems therefore holds significant potential.
Building on an algorithm-engineering co-design approach, we aim to further explore directions including multimodal co-training, models with hundreds of billions of parameters, and end-to-end modeling with extended sequence lengths.
On the engineering side, research directions include: multimodal sample representation; high-performance multimodal inference engines built on the PyTorch framework; high-performance multimodal training framework development; and the application of heterogeneous hardware in multimodal recommendation systems.
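To make the engineering directions above concrete, here is a minimal sketch of mixed-precision, inference-mode serving on the PyTorch framework named above; the TinyRanker model, its embedding dimensions, and the batch shapes are hypothetical placeholders, not the team's actual inference engine.

```python
# Minimal sketch: mixed-precision inference in PyTorch. TinyRanker is a
# hypothetical stand-in for a real multimodal ranking model.
import torch
import torch.nn as nn

class TinyRanker(nn.Module):
    """Toy two-tower fusion model over text and image embeddings."""
    def __init__(self, text_dim: int = 128, image_dim: int = 256, hidden: int = 64):
        super().__init__()
        self.text_proj = nn.Linear(text_dim, hidden)
        self.image_proj = nn.Linear(image_dim, hidden)
        self.head = nn.Linear(hidden, 1)

    def forward(self, text_emb: torch.Tensor, image_emb: torch.Tensor) -> torch.Tensor:
        fused = torch.relu(self.text_proj(text_emb) + self.image_proj(image_emb))
        return self.head(fused).squeeze(-1)  # one relevance score per sample

device = "cuda" if torch.cuda.is_available() else "cpu"
model = TinyRanker().to(device).eval()

text = torch.randn(32, 128, device=device)   # placeholder text embeddings
image = torch.randn(32, 256, device=device)  # placeholder image embeddings

# inference_mode skips autograd bookkeeping; autocast runs matmuls in half
# precision (fp16 on GPU, bf16 on CPU) to cut latency and memory traffic.
amp_dtype = torch.float16 if device == "cuda" else torch.bfloat16
with torch.inference_mode(), torch.autocast(device_type=device, dtype=amp_dtype):
    scores = model(text, image)
```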
On the algorithm side, research directions include: designing effective multimodal co-training architectures for recommendation and ads, sparse Mixture-of-Experts (MoE) models, memory networks, and mixed-precision training.
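For reference, here is a minimal sketch of one of the named algorithm directions, a top-k sparse Mixture-of-Experts (MoE) layer; the expert count, expert width, and softmax-over-top-k routing are illustrative assumptions rather than the production architecture.

```python
# Minimal sketch: a top-k sparse Mixture-of-Experts (MoE) layer. Expert
# width, expert count, and the router are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, dim: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, num_experts)  # learned router
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Each token is processed by only its top-k experts,
        # which is what keeps compute sparse as the expert count grows.
        weights, idx = self.gate(x).topk(self.top_k, dim=-1)  # (tokens, top_k)
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            token_ids, slot = (idx == e).nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue  # no token routed to this expert in the batch
            out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(x[token_ids])
        return out

# Usage: layer = SparseMoE(dim=256); y = layer(torch.randn(1024, 256))
```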
Topic Challenges
Achieve breakthroughs in multimodal representation fusion and in the training and inference bottlenecks of ultra-large-scale models; refine the co-design framework for algorithms and engineering; advance heterogeneous hardware adaptation and the development and deployment of domestically developed high-performance frameworks.
Topic Value
Enhance recommendation accuracy and generalization capability in multimodal scenarios; overcome the modality limitations of existing recommendation systems; empower multiple products, including short-video and text-based community platforms; reduce computational costs; and drive scalable business growth.
Qualifications
Minimum Qualifications
Preferred Qualifications