TALENT ONE MARKET INTELLIGENCE: D-MATRIX
TALENT LIQUIDITY:
Extremely tight. The intersection of deep firmware expertise and AI inference optimization is a niche with a 1:30 supply-to-demand ratio.
ALPHA SIGNAL:
This role puts you at the center of the AI hardware stack. By mastering the in-memory compute runtime, you become a 'force multiplier' for the entire product, securing equity that scales with the company's hardware adoption, effectively insulating your mortgage against the volatility of software-only startups.
$220k – $280k Base + Significant Equity Package
88.0/100
Official Role Description: d-Matrix
At d-Matrix, we are focused on unleashing the potential of generative AI to power the transformation of technology. We are at the forefront of software and hardware innovation, pushing the boundaries of what is possible. Our culture is one of respect and collaboration. We value humility and believe in direct communication. Our team is inclusive, and our differing perspectives allow for better solutions. We are seeking individuals passionate about tackling challenges and driven by execution. Ready to come find your playground? Together, we can help shape the endless possibilities of AI.

Location: Hybrid, working onsite at our Santa Clara, CA, headquarters 3 days per week.

The role: Staff Runtime Systems Engineer

What You Will Do:
d-Matrix is developing an AI compute platform focusing on in-memory compute for AI inference in datacenters. This position is for runtime software engineering, working on the architecture, development, and validation of the functionality and efficiency of firmware/software executed on a multiprocessor system-on-a-chip, low-level drivers, and the systems software that hosts this SoC. In this role, you will be largely responsible for all aspects of runtime performance of the silicon product. You will architect, document, and develop the runtime firmware that executes in various on-chip multi-core CPU subsystems. You will be responsible for determining the delivery schedule and ensuring the software meets d-Matrix coding and methodology guidelines.
You will collaborate with the hardware team, the hardware verification team, and other members of the software team.

Key Responsibilities:
- Design, implement, and develop systems software for AI inference infrastructure
- Identify, analyze, architect, develop, and debug systems software
- Experience in distributed and scale-out applications
- Deliver quality code; debug complex problems

Qualifications:
- Bachelor's in computer engineering or electrical engineering with a minimum of 5 years of industry experience in embedded software development
- Experience with containers and VMs
- Experience with Linux system-level programming
- Strong knowledge of Linux device drivers
- Proficiency in C/C++
- Experience working with Python is desired

Equal Opportunity Employment Policy
d-Matrix is proud to be an equal opportunity workplace and affirmative action employer. We're committed to fostering an inclusive environment where everyone feels welcomed and empowered to do their best work. We hire the best talent for our teams, regardless of race, religion, color, age, disability, sex, gender identity, sexual orientation, ancestry, genetic information, marital status, national origin, political affiliation, or veteran status. Our focus is on hiring teammates with humble expertise, kindness, dedication, and a willingness to embrace challenges and learn together every day.

d-Matrix does not accept resumes or candidate submissions from external agencies. We appreciate the interest and effort of recruitment firms, but we kindly request that individuals interested in opportunities with d-Matrix apply directly through our official channels. This approach allows us to streamline our hiring processes and maintain a consistent and fair evaluation of all applicants. Thank you for your understanding and cooperation.