Summary

Posted: Aug 30, 2023
Role Number: 200500000

One of Apple’s R&D groups is currently seeking a senior perception sensor algorithm development engineer. This engineer will develop algorithms, models, visualizations, and production firmware that enable intelligent perception features. The candidate will lead algorithm architecture, prototyping, and development related to geometric signal processing, sensor calibration, world representation, tracking, and classification. Typical work products include models, algorithms, and real-time software, punctuated with leadership demos, presentations, and software releases.

Key Qualifications

- Skilled in the development, theory, and design of full-stack perception: from modeling and simulation of algorithms to implementation on real-time target hardware.
- Deep expertise in object detection, geometric processing, extent and occlusion reasoning, and tracking of dynamic agents.
- Experience with lidar-based perception.
- Experience with calibration of intrinsic and extrinsic parameters, plane fitting, and point-to-plane residual minimization; capable of implementing these as batch least squares or similar optimization methods (an illustrative sketch follows this list).
- Experience with object tracking such as Kalman filters, ARMA models, observers, complementary filters, and model-based predictors (a second sketch follows this list).
- Fundamental understanding of optics, ray tracing, and evaluation of sensor data products.
- Experience building evaluation frameworks for sensitivity studies and analyzing the resulting large data sets.
- Experience with 3D trajectory generation (equations of motion) and its applicability to depth sensors.
- Extensive experience in scalable scientific computing (MATLAB and Python) with the capability to implement in C/C++.
- Comfortable with source control (Git) on multi-developer projects.
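The plane-fitting and point-to-plane qualification above is the kind of problem that reduces to a small least-squares fit. The sketch below is a minimal, illustrative Python/NumPy example, assuming a synthetic planar lidar patch; the function names (fit_plane, point_to_plane_residuals) are hypothetical and not drawn from the posting.

```python
# Illustrative sketch only: fit a plane to a 3D point cloud by minimizing
# point-to-plane residuals via SVD (total least squares). Names and data
# are hypothetical, not part of any Apple codebase.
import numpy as np

def fit_plane(points: np.ndarray):
    """Fit a plane n . x = d to an (N, 3) point cloud.

    Returns the unit normal n and offset d minimizing the sum of
    squared perpendicular (point-to-plane) distances.
    """
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the
    # direction of least variance of the centered points, i.e. the normal.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    d = normal @ centroid
    return normal, d

def point_to_plane_residuals(points: np.ndarray, normal: np.ndarray, d: float):
    """Signed perpendicular distance of each point to the plane."""
    return points @ normal - d

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic planar patch with additive noise, standing in for a lidar return.
    xy = rng.uniform(-1.0, 1.0, size=(500, 2))
    z = 0.2 * xy[:, 0] - 0.1 * xy[:, 1] + 0.5 + 0.01 * rng.standard_normal(500)
    cloud = np.column_stack([xy, z])

    n, d = fit_plane(cloud)
    res = point_to_plane_residuals(cloud, n, d)
    print("normal:", n, "offset:", d, "RMS residual:", np.sqrt(np.mean(res**2)))
```

In a batch calibration setting, residuals like these would typically feed a larger least-squares problem over the sensor's intrinsic and extrinsic parameters; the sketch shows only the per-plane building block.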
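Similarly, the object-tracking qualification names Kalman filters and model-based predictors. The second sketch is a minimal constant-velocity Kalman filter over noisy 2D position measurements; the motion model, class name, and noise values are made-up illustrative choices, not the posting's method.

```python
# Illustrative sketch only: constant-velocity Kalman filter tracking one
# object's 2D position from noisy measurements. Parameters are invented
# for demonstration and are not tuned for any product.
import numpy as np

class ConstantVelocityKF:
    def __init__(self, dt: float, meas_var: float, accel_var: float):
        # State: [x, y, vx, vy]; measurement: [x, y].
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        # Simplified white-noise-acceleration process noise (diagonal only).
        self.Q = accel_var * np.diag([dt**4 / 4, dt**4 / 4, dt**2, dt**2])
        self.R = meas_var * np.eye(2)
        self.x = np.zeros(4)
        self.P = np.eye(4)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x

    def update(self, z: np.ndarray):
        y = z - self.H @ self.x                    # innovation
        S = self.H @ self.P @ self.H.T + self.R    # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x

if __name__ == "__main__":
    kf = ConstantVelocityKF(dt=0.1, meas_var=0.05, accel_var=0.5)
    rng = np.random.default_rng(1)
    est = None
    for k in range(50):
        truth = np.array([0.5 * k * 0.1, 0.2 * k * 0.1])   # straight-line motion
        z = truth + 0.05 * rng.standard_normal(2)           # noisy measurement
        kf.predict()
        est = kf.update(z)
    print("final estimate:", est)
```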
Description

In this role you will partner with other sensor and algorithm subject matter experts to lead the development of a scalable sensor-based perception system. You will prototype, implement, validate, and deliver world-class perception algorithms.

- Develop new features and validate performance in the real world.
- Work with software quality engineers, human interface, and user studies groups to develop key features on a unique Apple product.

Education & Experience

BS or MS in Engineering plus 5-10 years of industry experience.

Additional Requirements

Nice to have:
- Experience with customer-facing features
- Experience with algorithm optimization for compute-constrained platforms
- Experience with ML-based approaches to perception

Pay & Benefits

At Apple, base pay is one part of our total compensation package and is determined within a range. This provides the opportunity to progress as you grow and develop within a role. The base pay range for this role is between $170,700 and $300,200, and your base pay will depend on your skills, qualifications, experience, and location.

Apple employees also have the opportunity to become an Apple shareholder through participation in Apple’s discretionary employee stock programs. Apple employees are eligible for discretionary restricted stock unit awards, and can purchase Apple stock at a discount if voluntarily participating in Apple’s Employee Stock Purchase Plan. You’ll also receive benefits including: comprehensive medical and dental coverage, retirement benefits, a range of discounted products and free services, and, for formal education related to advancing your career at Apple, reimbursement for certain educational expenses, including tuition. Additionally, this role might be eligible for discretionary bonuses or commission payments as well as relocation. Learn more about Apple Benefits.

Note: Apple benefit, compensation and employee stock programs are subject to eligibility requirements and other terms of the applicable plan or program.