Pranav Ajit Nair
Google DeepMind, India
pranavajitnair@google.com
I am a Pre-Doctoral Researcher at Google DeepMind, India, working with Dr. Praneeth Netrapalli, Dr. Arun Suggala, and Dr. Prateek Jain. My work focuses on making LLM inference faster through quantization, speculative decoding, and sparsification. I am also working on speeding up attention over million-token contexts through clustering and approximate logit computation.
My interests include, but are not limited to: i) making LLM inference faster through next-generation architectures, quantization, sparsification, speculative decoding, KV-cache compression, and adaptive routing for elastic models; and ii) speeding up LLM pretraining and finetuning through better adapters, novel loss functions, better second-order optimizers, and faster checkpointing.