Kaan Sancak

Senior Research Scientist at Meta

I work on machine learning systems, recommendation, and AI infrastructure. My background is in high-performance computing and machine learning, and I am interested in the systems and algorithms that make modern ML efficient, scalable, and useful in practice.

Before Meta, I received my Ph.D. in Computer Science from Georgia Tech, advised by Umit V. Catalyurek, where I focused on high-performance computing, GPU systems, parallel algorithms, and machine learning at scale.

Email Google Scholar GitHub LinkedIn CV

🔬 Research Interests

High-Performance Computing · Machine Learning · ML Systems & AI Infrastructure · Recommendation Systems · Parallel & Distributed Computing · GPU Computing

💼 Experience


Meta Full-time · Aug 2024 – present

Senior Research Scientist · Meta Recommendation Systems (MRS) · Feb 2026 – present

Building next-generation recommendation models and ML infrastructure powering Meta's core ranking and personalization systems at scale.

Research Scientist · Ads Ranking & Foundational AI (RAI) · Aug 2024 – Feb 2026

Model–infrastructure co-design for billion-scale ad recommendation. Built real-time graph integration that improved data freshness from days to minutes, boosted training throughput by 20%, and cut feature storage cost by 3x. Lead contributor to the Ads Graph Foundational Model (GFM).


Meta Internship · 2023

Research Scientist Intern · Ranking & Foundational AI · May – Aug 2023

Conducted research on scalable graph-based models for efficient learning without sacrificing quality. Work published at AAAI 2025 and ICLR 2024.


Meta Internship & Part-time · 2022

Part-time Student Researcher · AI Systems HW/SW Co-Design · Aug – Dec 2022

Extended caching mechanisms for Meta's ML training platform, substantially reducing redundant data serving computations for key models. See the AI System Co-design project.

Research Scientist Intern · AI Systems HW/SW Co-Design · May – Aug 2022

Built caching infrastructure for Meta's data ingestion pipelines, eliminating redundant computation across large-scale model training runs. See the AI System Co-design project.


Pacific Northwest National Laboratory Internship · 2021

Research Intern · HPC & Systems · May – Aug 2021

Developed distributed graph algorithms and high-performance data structures on the SHAD framework.


Facebook Internship · 2020

Research Intern · AI Systems HW/SW Co-Design · May – Aug 2020

Improved Facebook's graph engine performance via novel partitioning, yielding a 10% query throughput gain and up to 5x end-to-end speedup. Integrated the engine with Instagram Ads; the infrastructure still serves Instagram, Threads, and Facebook.


Google Summer of Code / NRNB · 2018

Open Source Developer · May – Aug 2018

Built collaborative pathway editing tools for cBioPortal for Cancer Genomics (Memorial Sloan Kettering Cancer Center).


IBM Internship · 2017

Software Engineering Intern · Jun – Aug 2017

Built cloud data transfer and object recognition applications (IBM Cloud, Python, Kafka).

🎓 Education


Georgia Institute of Technology

Ph.D. in Computer Science · GPA: 4.00/4.00 · Advisor: Umit V. Catalyurek

Research: High-performance computing, machine learning, graph systems and algorithms

Teaching: CSE 6230: High-Performance Parallel Computing · CSE 6220: Intro to HPC


Bilkent University, Turkey

Summa Cum Laude · GPA: 3.82/4.00 · Ranked 4th / 231 engineering students

📖 Selected Publications

Haystack Engineering: Context Engineering for Heterogeneous and Agentic Long-Context Evaluation

M. Li, D. Fu, L. Wang, S. Zhang, H. Zeng, K. Sancak, R. Qiu, H. P. Wang, X. He, X. Bresson, Y. Xia, C. Sun, P. Li

arXiv 2025 · Paper

Haystack Engineering: Context Engineering Meets the Long-Context Challenge in LLMs

M. Li, D. Fu, L. Wang, S. Zhang, H. Zeng, K. Sancak, R. Qiu, H. P. Wang, X. He, X. Bresson, Y. Xia, C. Sun, P. Li

NeurIPS 2025 Workshop · Paper

Full list on Google Scholar →