Android Engineering at Snap: Camera, AR, and On-Device ML
In short
Android engineering at Snap is built around Snapchat, a camera-first app where the camera, AR Lenses, and messaging are the product, not a feature. Android engineers there work close to a real-time graphics and ML pipeline: Camera2 / CameraX feeding GL and Vulkan rendering, Lens Core driving AR effects authored in Lens Studio, and SnapML running on-device inference for segmentation, tracking, and effects. The same Lens platform that powers Snapchat Android also powers Spectacles, Snap's AR glasses, so platform Android work has a hardware tie-in. Interviews emphasize Android fundamentals plus graphics, performance, or ML depending on team. Levels.fyi puts senior Android total comp in the mid-$300Ks to high-$400Ks range, with staff well above. Snap's engineering blog and ar.snap.com are the most useful public sources; team-by-team org structure is not fully published.
Key takeaways
- Snapchat is camera-first: the camera and AR Lenses are the product, not a feature.
- Android engineers work close to a real-time graphics and ML pipeline, not just UI plumbing.
- Lens Core powers AR effects authored in Lens Studio; the same platform runs on Spectacles.
- SnapML runs ML inference on-device for segmentation, tracking, and effects, with tight latency budgets.
- Camera capture uses Camera2 / CameraX, with GL and Vulkan rendering paths for effects.
- Interview loop covers Android fundamentals, systems, and a graphics or ML round depending on team.
- Senior Android (L4) total comp is commonly mid-$300Ks to high-$400Ks per Levels.fyi self-reports.
- Snap publishes good detail on AR and ML platforms; per-team Android org structure is less public.
Android engineer at Snap in 2026: camera + AR
Snapchat is one of the few mainstream Android apps where the camera is the home screen. Open the app and you are looking through a viewfinder with AR Lenses one swipe away; messaging, Stories, Spotlight, and Map are organized around that capture surface rather than competing with it. For an Android engineer, that product shape changes the work. Most Android roles at consumer companies are heavy on RecyclerViews, networking, and feed rendering. At Snap, those exist, but a meaningful slice of the engineering org sits closer to the camera, the GPU, and the ML pipeline that turns frames into AR.
Concretely, Android teams at Snap split roughly into three families. Camera and capture teams own the path from sensor to encoder, including Camera2 and CameraX integration, frame timing, exposure and focus control, HDR pipelines, and the bridge into the rendering layer. Lens and AR teams own Lens Core on Android, the runtime that loads Lenses authored in Lens Studio and drives them across capture and preview. App platform teams own the rest: messaging, the Map, Spotlight, payments, growth infrastructure, and the Kotlin and Jetpack stack the rest of the app uses. Camera and AR work tends to be heavier on C++, JNI, and graphics; app platform work tends to look more like a modern Kotlin and Compose codebase, with Snap's own internal frameworks layered in.
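To make the sensor-to-renderer handoff concrete, here is a minimal Kotlin sketch of a Camera2 preview feeding a GL-owned SurfaceTexture, the general shape of pipeline the capture teams own. It is illustrative only, not Snap's code: the class name, resolution, and null handlers are assumptions, and permission checks, error handling, and real GL-thread dispatch are omitted.

```kotlin
import android.annotation.SuppressLint
import android.content.Context
import android.graphics.SurfaceTexture
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraManager
import android.view.Surface

// Hypothetical sketch: hand Camera2 frames to a SurfaceTexture bound to a
// GL texture, so a render pass can apply effects before display/encode.
class CaptureBridge(private val context: Context, glTextureId: Int) {

    // onFrameAvailable fires once per camera frame.
    private val surfaceTexture = SurfaceTexture(glTextureId).apply {
        setDefaultBufferSize(1920, 1080) // assumed resolution
        setOnFrameAvailableListener {
            // On the GL thread: call updateTexImage(), then run the effect pass.
        }
    }
    private val previewSurface = Surface(surfaceTexture)

    @SuppressLint("MissingPermission") // caller must hold CAMERA permission
    fun open(cameraId: String) {
        val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
        manager.openCamera(cameraId, object : CameraDevice.StateCallback() {
            override fun onOpened(device: CameraDevice) = startSession(device)
            override fun onDisconnected(device: CameraDevice) = device.close()
            override fun onError(device: CameraDevice, error: Int) = device.close()
        }, null)
    }

    private fun startSession(device: CameraDevice) {
        device.createCaptureSession(listOf(previewSurface),
            object : CameraCaptureSession.StateCallback() {
                override fun onConfigured(session: CameraCaptureSession) {
                    val request = device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
                        .apply { addTarget(previewSurface) }
                        .build()
                    session.setRepeatingRequest(request, null, null)
                }
                override fun onConfigureFailed(session: CameraCaptureSession) = Unit
            }, null)
    }
}
```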
The Spectacles tie-in matters even for engineers who do not work on the glasses directly. Spectacles run the same Lens platform, so improvements to Lens Core, SnapML inference paths, or tracking quality on Snapchat Android often flow into Spectacles, and vice versa. Snap publishes developer docs for Spectacles at ar.snap.com, and several Android-side platform improvements have shown up first in Lens Studio release notes before reaching the consumer Snapchat app. Candidates who care about AR as a long-arc bet, rather than as a feature, find that alignment unusually strong at Snap.
Day-to-day, the work also has a strong device-tier dimension that is unusual for an iOS-first company. Snapchat is a global product with a long tail of mid-range and budget Android devices in markets where Snap has meaningful share. That means a Lens or capture engineer is not optimizing for a single flagship: they are reasoning about thermal throttling on a Snapdragon 6-series chip, GL driver quirks on older Mali GPUs, and degradation paths when a Lens cannot hit its frame budget. App platform engineers feel this too, in startup time, memory headroom, and the size of the APK delivered through Play. The shorthand inside Android communities is that Snap is one of the few large consumer apps where graphics, ML, and device-tier engineering all matter at the same time, which is the part of the job that does not show up in a careers-site bullet.
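For a flavor of what device-tier degradation can look like in code, here is a hedged sketch that steps an effect down through quality tiers using the platform thermal API (API 29+). The tier names and thresholds are invented for illustration; Snap's actual degradation logic is not public.

```kotlin
import android.content.Context
import android.os.Build
import android.os.PowerManager

// Illustrative only: quality tiers an effect might fall back through.
enum class EffectTier { FULL, REDUCED, STATIC_FALLBACK }

fun watchThermals(context: Context, onTier: (EffectTier) -> Unit) {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.Q) return // no thermal API pre-29
    val pm = context.getSystemService(Context.POWER_SERVICE) as PowerManager
    pm.addThermalStatusListener { status ->
        val tier = when {
            status >= PowerManager.THERMAL_STATUS_SEVERE -> EffectTier.STATIC_FALLBACK
            status >= PowerManager.THERMAL_STATUS_MODERATE -> EffectTier.REDUCED
            else -> EffectTier.FULL
        }
        onTier(tier) // e.g. drop render scale, skip ML passes, cap frame rate
    }
}
```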
Interview process
Snap's Android interview loop is a recognizable big-tech shape with a twist for the camera and AR roles. The recruiter screen establishes role, level, and team area. A technical phone screen follows, typically a coding round in Kotlin or Java with a problem that has a real Android flavor: lifecycle, threading, or a small data-shape problem you solve in a shared editor.
The onsite is usually four to five rounds:
- Coding: one or two rounds of medium-to-hard algorithm and data-structure work. Patterns are standard interview material; the bar is correctness, clarity, and complexity reasoning.
- Android fundamentals: lifecycle, process death, threading, memory, configuration changes, Jetpack components, and Compose-versus-View trade-offs. Expect questions about ANRs, jank, and how you would diagnose a frame-time regression on a real device (a minimal FrameMetrics sketch follows this list).
- Systems / app design: design a feature end-to-end, for example an offline-first messaging surface, a capture pipeline with background upload, or a Lens browser with caching. Snap interviewers push on failure modes, retries, and battery and data cost.
- Domain round (camera, graphics, or ML): for camera-adjacent or AR roles, expect a round on Camera2, GL or Vulkan, shader basics, color space, or on-device ML inference and quantization. For Lens platform roles, expect deeper graphics and C++ questions.
- Behavioral and values: ownership, dealing with ambiguity, cross-team collaboration, and how you handle production incidents.
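As flagged in the fundamentals bullet above, here is a minimal sketch of the kind of frame-time diagnosis that round probes, using the platform FrameMetrics API (API 24+). The 16 ms budget and the single jank counter are simplifying assumptions; a real investigation would bucket time by stage.

```kotlin
import android.app.Activity
import android.os.Handler
import android.os.HandlerThread
import android.view.FrameMetrics
import java.util.concurrent.TimeUnit

// Interview-style sketch: count frames that blow a 60 Hz budget.
fun trackJank(activity: Activity, budgetMs: Long = 16) {
    val thread = HandlerThread("frame-metrics").apply { start() }
    var janky = 0
    var total = 0
    activity.window.addOnFrameMetricsAvailableListener({ _, metrics, _ ->
        val durationMs = TimeUnit.NANOSECONDS.toMillis(
            metrics.getMetric(FrameMetrics.TOTAL_DURATION))
        total++
        if (durationMs > budgetMs) janky++
        // A real diagnosis would split this into input, layout, draw, and
        // swap buckets via the other FrameMetrics constants.
    }, Handler(thread.looper))
}
```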
Two notes on the loop. First, Snap is not coy about the camera and AR specialization: if you are interviewing for a Lens or capture role, the domain round is real and you should prepare it the way an iOS candidate would prepare Metal. Second, app platform roles tend to drop the graphics round in favor of a second systems or product-thinking round, so the loop shape varies by team.
Compensation by level
Snap is a public company (NYSE: SNAP), so equity is denominated in real RSUs and the comp picture on Levels.fyi has good fidelity. The numbers below are pulled from Levels.fyi self-reported offers and refreshes for software engineers, with Android falling on the same ladder; they are directional and should be validated with a recruiter.
- L2 / SWE II (entry): roughly $180K-$230K total, weighted toward base and sign-on with a modest RSU grant.
- L3 / SWE III (mid): roughly $240K-$320K total, with RSUs becoming a larger share.
- L4 / Senior SWE: roughly $340K-$470K total, with equity typically the dominant component.
- L5 / Staff SWE: roughly $500K-$700K+ total in recent self-reports.
- L6 / Senior Staff and above: thinner sample sizes, but self-reports cluster well above $700K total.
Two caveats. First, Snap RSUs are real public-market shares, so total comp moves with the share price between offer and vest; numbers above reflect grant-date values. Second, Android-specific medians on Levels.fyi are thinner than overall SWE medians, so individual data points can swing the published ranges. The best signal is to ask a recruiter for the level band in writing and to compare against the cross-company levels.fyi/t/software-engineer benchmarks.
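To make the grant-versus-vest caveat concrete, here is a toy calculation; every number in it is invented for illustration and is not a Snap figure.

```kotlin
// An offer quoting $200K/yr in RSUs at a $10 grant price implies 20,000
// shares per year; what those are worth depends on the price at vest.
fun vestYearValue(annualGrantUsd: Double, grantPrice: Double, vestPrice: Double): Double {
    val sharesPerYear = annualGrantUsd / grantPrice
    return sharesPerYear * vestPrice
}

// vestYearValue(200_000.0, 10.0, 8.0)  == 160_000.0  (20% price drop)
// vestYearValue(200_000.0, 10.0, 14.0) == 280_000.0  (40% price rally)
```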
Tech stack: native Android + Lens Studio + on-device ML
Snap's Android stack is best understood in three layers: a modern app platform layer in Kotlin, a graphics and capture layer that bridges into C++ and the GPU, and an ML and AR runtime that loads Lenses authored outside the app. The picture below stitches together public references from eng.snap.com, ar.snap.com, and Lens Studio documentation.
- App platform: Kotlin is the primary language for app platform code, with Jetpack components, coroutines, and a mix of legacy Views and newer Compose surfaces depending on team and recency. Snap has its own internal framework conventions layered on top, so candidates should expect a learning curve on first-party libraries that are not always one-to-one with public Jetpack patterns.
- Camera and capture: Camera2 is the underlying API, with CameraX used selectively. Capture pipelines hand frames to a rendering layer that uses OpenGL ES and Vulkan depending on device and effect. Color management, HDR, and frame pacing are first-class concerns because the camera is the product.
- AR and Lens runtime: Lens Core is the cross-platform runtime that loads Lenses authored in Lens Studio. On Android it runs as native code over JNI, with the app providing camera frames, sensor data, and rendering surfaces (a hypothetical sketch of such a JNI boundary follows this list). The same runtime ships in Spectacles, which is why platform improvements have leverage across products.
- On-device ML (SnapML): SnapML is Snap's framework for running ML inside Lenses, on-device. It is used for segmentation, tracking, classification, and generative effects, with models authored or imported through Lens Studio. Constraints are real: models have to fit a Lens budget, run inside a frame at interactive frame rates, and degrade gracefully across device tiers.
- Networking and storage: standard mobile concerns (media upload, retries, offline behavior) with internal libraries; messaging in particular has its own end-to-end design that Snap discusses in pieces rather than as a single architecture post.
- Build, CI, and release: Gradle with internal plugins and modularization, internal CI, and staged release through the Play Store; specifics are not fully public.
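As flagged in the Lens runtime bullet above, here is a hypothetical sketch of what a Kotlin-to-native bridge into an AR runtime can look like. Every name in it, the class, the methods, and the library, is invented; Lens Core's actual API is not public.

```kotlin
// Hypothetical JNI boundary into a native AR runtime.
class NativeLensRuntime {
    companion object {
        init { System.loadLibrary("lens_runtime_stub") } // invented .so name
    }

    // Opaque handle to the C++ runtime instance.
    private var nativeHandle: Long = nativeCreate()

    // Feed one camera frame (a GL texture id plus timestamp) across JNI.
    fun submitFrame(glTextureId: Int, timestampNs: Long) =
        nativeSubmitFrame(nativeHandle, glTextureId, timestampNs)

    fun release() {
        nativeDestroy(nativeHandle)
        nativeHandle = 0
    }

    private external fun nativeCreate(): Long
    private external fun nativeSubmitFrame(handle: Long, textureId: Int, timestampNs: Long)
    private external fun nativeDestroy(handle: Long)
}
```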
For an Android candidate, the practical implication is that graphics and ML are not optional electives at Snap. Even app platform engineers brush against the camera surface often enough that frame-time and GPU literacy matter, and Lens or capture roles expect real depth.
One distinction worth calling out: SnapML is not the same as running a model in the app. The framework is designed so that Lens authors and ML engineers can ship a model as part of a Lens, and the AR runtime invokes it inline with the rendering pipeline. That means inference latency competes with the frame budget and other Lens work, not with a generous off-screen background-task budget. Models go through quantization, pruning, and graph optimization before they ship, and the tooling around that pipeline (in Lens Studio and on the AR platform side) is a meaningful part of what the platform org owns. From an Android perspective, this looks closer to embedded ML on a phone than to the typical Android ML use case of running a TensorFlow Lite model in a background coroutine to classify an image.
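For contrast, here is that "typical Android ML" pattern sketched out: a TensorFlow Lite model run in a background coroutine, where inference competes only with other background work rather than a render frame budget. Model loading, input preprocessing, and the class count are assumed.

```kotlin
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.withContext
import org.tensorflow.lite.Interpreter
import java.nio.MappedByteBuffer

// Off-screen inference on a background dispatcher: nothing here has to
// finish inside a 16 ms frame, unlike a model invoked inside a Lens.
class BackgroundClassifier(modelBuffer: MappedByteBuffer) {
    private val interpreter = Interpreter(modelBuffer)

    suspend fun classify(input: Array<FloatArray>): FloatArray =
        withContext(Dispatchers.Default) {
            val output = Array(1) { FloatArray(NUM_CLASSES) }
            interpreter.run(input, output) // blocking call, fine off the main thread
            output[0]
        }

    companion object { const val NUM_CLASSES = 1000 } // model-specific; assumed
}
```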
Frequently asked questions
- Is Snapchat Android written in Kotlin or Java?
- Modern app platform code at Snap is primarily Kotlin, with Jetpack and coroutines. Older modules and graphics-adjacent code include Java and a significant C++ layer over JNI for the AR runtime and capture pipeline.
- What is Lens Core, and do Android engineers work on it?
- Lens Core is Snap's cross-platform AR runtime that loads Lenses authored in Lens Studio. On Android it runs as native code with a JNI bridge. Lens platform Android engineers work directly on it; app platform engineers integrate with it through a higher-level API.
- What is SnapML?
- SnapML is Snap's framework for running ML inference on-device inside Lenses. It is used for segmentation, tracking, classification, and generative effects, with models authored or imported through Lens Studio and deployed inside the AR runtime.
- Does Spectacles share code with the Snapchat Android app?
- Spectacles run the same Lens platform that powers Snapchat AR, so improvements to Lens Core, SnapML, and tracking quality have leverage across products. The host application code is different, but the AR runtime shares a common lineage.
- How hard is the Snap Android interview compared to FAANG?
- Comparable in coding difficulty, with a real domain round for camera, AR, or ML roles. Android fundamentals are tested seriously (lifecycle, threading, jank), and systems rounds emphasize capture, media, or messaging design rather than generic distributed-systems trivia.
- What does senior Android total comp look like at Snap?
- Per Levels.fyi self-reports, senior (L4) total comp commonly lands in the mid-$300Ks to high-$400Ks, with staff (L5) ranging into the $500Ks to $700K+. RSUs are public-market Snap shares, so totals move with the share price between grant and vest.
- Is Snap Android remote-friendly?
- Snap has primary engineering hubs in Los Angeles (Santa Monica), Seattle, New York, and London, plus other offices. Remote eligibility varies by team and role; check individual listings on the careers site, since policies have shifted over time.
- Where should I read more before applying?
- Three places. The Snap engineering blog at eng.snap.com for architecture and platform posts. ar.snap.com for the AR and Lens Studio platform, including Spectacles. Levels.fyi for current comp ranges by level, both on the Snap company page and the cross-company software engineer benchmark.
About the author. Blake Crosley founded ResumeGeni and writes about Android engineering, hiring technology, and ATS optimization. More writing at blakecrosley.com.