About us
Makora is a venture-backed AI lab building tools to automate algorithm discovery and GPU performance engineering. There are two core components:
- MakoraGenerate writes GPU kernels in CUDA, HIP, and Triton using LLMs
- MakoraOptimize automatically selects and swaps GPU kernels while tuning inference-engine hyperparameters (vLLM, SGLang, etc.) to optimize performance
We are located in New York City and Gdańsk.
Current open positions
General Application
GPU Kernel Engineer
AI Engineer
AI Engineer Intern
Forward Deployed Engineer
To apply, fill out the form below
Fill out this form