Summary
Posted: Oct 4, 2023
Role Number: 200508674

The 3D Reconstruction team is hiring a Software Engineer to help us ship the next generation of products. With last year's release of ARKit, we shipped the world's largest deployment of 3D meshing based on the new Apple LiDAR sensor. We need help taking this to the next level as our algorithms, systems, and hardware evolve. This role is focused on building and integrating a real-time system of algorithms for 3D reconstruction, including system design and implementation, sensor and platform integration, and optimization for power/thermal constraints. The ideal candidate will have experience building and debugging real-time, robust, fault-tolerant, concurrent systems, as well as experience with 3D computer vision and image processing.

Key Qualifications
- Proficiency with C++, with experience working in parallel and concurrent systems.
- Experience programming and debugging real-time multi-threaded systems.
- Experience with 3D computer vision (e.g., camera projection models, RGBD data, data fusion); see the sketch after this list.
- Familiarity with inter-process communication and related concepts (RPC, XPC, shared memory, mmap) is a plus.
- Familiarity with quality software development processes, such as unit testing, test-driven development, code review, and continuous integration and delivery, is a plus.
- Experience collaborating cross-functionally with other engineering teams is a plus.
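To make the camera projection models and RGBD data mentioned above concrete, here is a minimal sketch. It is not Apple code and is not part of the posting; it assumes a simple pinhole model, and the Intrinsics and Point3 types and the example intrinsic values are hypothetical placeholders. It shows how a depth sample at a pixel is lifted into a 3D point in the camera frame.

```cpp
// Minimal sketch (not Apple code): back-projecting an RGBD pixel into a 3D
// camera-space point with a pinhole projection model. The types and the
// example intrinsics are hypothetical placeholders.
#include <cstdio>

struct Intrinsics {
    double fx, fy;  // focal lengths in pixels
    double cx, cy;  // principal point in pixels
};

struct Point3 {
    double x, y, z;
};

// Lift a depth sample at pixel (u, v) into a 3D point in the camera frame.
Point3 backProject(const Intrinsics& k, double u, double v, double depthMeters) {
    Point3 p;
    p.z = depthMeters;
    p.x = (u - k.cx) / k.fx * depthMeters;
    p.y = (v - k.cy) / k.fy * depthMeters;
    return p;
}

int main() {
    Intrinsics k{600.0, 600.0, 320.0, 240.0};  // made-up intrinsics
    Point3 p = backProject(k, 400.0, 300.0, 1.5);
    std::printf("3D point: (%.3f, %.3f, %.3f)\n", p.x, p.y, p.z);
    return 0;
}
```

Projecting the resulting point back through the same intrinsics recovers the original pixel coordinates, which is a common sanity check when working with RGBD data and data fusion.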
Description
As a Software Engineer on the 3D Reconstruction System Algorithms and Architecture team, you will have the following responsibilities:
- Be responsible for the architecture, design, development, and operations of a real-time system for 3D computer vision. This may include, but is not limited to, sensor integration; integrating and debugging parallel and concurrent compute systems; developing algorithmic systems for processing RGBD image data; designing efficient data flows between algorithmic systems, both within a process and between processes (see the sketch after this list); power/latency/memory optimization; and bring-up of algorithmic systems on new hardware and sensors.
- Partner with software and system engineers across different Apple organizations to support efficient integration of 3D computer vision algorithms and to enable adoption of these algorithms by new internal customers.
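As a hedged illustration of the within-process data flows mentioned in the first responsibility, the sketch below connects a capture stage to a fusion stage through a bounded, thread-safe frame queue. It is not the team's framework; Frame, FrameQueue, and the stage logic are illustrative assumptions.

```cpp
// Minimal sketch (not Apple's framework): a bounded, thread-safe queue passing
// frames from a capture stage to a fusion stage within one process. All names
// are illustrative placeholders.
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct Frame {
    int id;
    std::vector<float> depth;  // stand-in for an RGBD payload
};

class FrameQueue {
public:
    explicit FrameQueue(size_t capacity) : capacity_(capacity) {}

    // Blocks when the queue is full so a slow consumer applies back-pressure.
    void push(Frame frame) {
        std::unique_lock<std::mutex> lock(mutex_);
        notFull_.wait(lock, [&] { return queue_.size() < capacity_; });
        queue_.push(std::move(frame));
        notEmpty_.notify_one();
    }

    // Blocks until a frame is available.
    Frame pop() {
        std::unique_lock<std::mutex> lock(mutex_);
        notEmpty_.wait(lock, [&] { return !queue_.empty(); });
        Frame frame = std::move(queue_.front());
        queue_.pop();
        notFull_.notify_one();
        return frame;
    }

private:
    size_t capacity_;
    std::queue<Frame> queue_;
    std::mutex mutex_;
    std::condition_variable notEmpty_, notFull_;
};

int main() {
    FrameQueue queue(4);  // small capacity keeps memory bounded

    std::thread producer([&] {
        for (int i = 0; i < 10; ++i) {
            queue.push(Frame{i, std::vector<float>(640 * 480, 1.0f)});
        }
    });

    std::thread consumer([&] {
        for (int i = 0; i < 10; ++i) {
            Frame f = queue.pop();
            std::printf("fused frame %d (%zu depth samples)\n", f.id, f.depth.size());
        }
    });

    producer.join();
    consumer.join();
    return 0;
}
```

The bounded capacity provides back-pressure, which keeps memory use predictable when a downstream stage falls behind, one of the concerns a power/latency/memory-constrained real-time system has to manage.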
In this role, you will be building the system framework that integrates multiple real-time computer vision algorithms across several teams at Apple.

Education & Experience
MS or PhD in Computer Science with 1+ years of experience, or BS with 3+ years of experience.

Additional Requirements

Pay & Benefits
At Apple, base pay is one part of our total compensation package and is determined within a range. This provides the opportunity to progress as you grow and develop within a role. The base pay range for this role is between $116,100 and $208,300 annualized, and your base pay will depend on your skills, qualifications, experience, and location. Apple employees also have the opportunity to become an Apple shareholder through participation in Apple's discretionary employee stock programs. Apple employees are eligible for discretionary restricted stock unit awards, and can purchase Apple stock at a discount by voluntarily participating in Apple's Employee Stock Purchase Plan. You'll also receive benefits including comprehensive medical and dental coverage, retirement benefits, a range of discounted products and free services, and reimbursement for certain educational expenses, including tuition, for formal education related to advancing your career at Apple. Additionally, this role might be eligible for discretionary bonuses or commission payments as well as relocation. Learn more about Apple Benefits.

Note: Apple benefit, compensation and employee stock programs are subject to eligibility requirements and other terms of the applicable plan or program.