Master Data Engineer - Instant Ink

Company: HP
Location: Vancouver, Washington, United States of America
Commitment: Full time
Posted on: 2023-05-05 16:14
Description

Does Big Data and cloud-native data lakes/warehouses get you excited? Does the thought of building sustainable Big Data pipelines on behalf of customers make you smile? How about working with leading-edge technologies like Amazon Redshift, S3, Databricks, Airflow, and NiFi?

The data engineer in this role is a team member who will help enhance and maintain the Instant Ink Business Intelligence system. This role will work with external and internal business partners to capture business requirements and help develop solutions to support the business. You will drive your work to completion with hands-on development responsibilities, and partner with the Data Engineering leaders to provide thought leadership and innovation with those results. This individual will apply developed subject matter knowledge to solve common and complex business issues and recommend appropriate alternatives. The daily work will consist of solving problems of diverse complexity and scope, while exercising independent judgment within generally defined policies and practices. This role will require handling unique or unclear requirements and seeking advice from team members and leaders to make decisions on complex business issues.
Responsibilities

• Works with other data engineers and architects to establish secure and performant data architectures, enhancements, updates, and programming changes for portions and subsystems of the data platform, repositories, or models for structured/unstructured data.
• Analyzes designs and determines the coding, programming, and integration activities required based on general objectives and knowledge of the overall architecture of the product or solution.
• Reviews and evaluates designs and project activities for compliance with architecture, security, and quality guidelines and standards; provides tangible feedback to improve product quality and mitigate failure risk.
• Writes and executes complete testing plans, protocols, and documentation for the assigned portion of the data system or component; identifies defects and creates solutions for issues with code and integration into the data system architecture.
• Collaborates and communicates with the project team regarding project progress and issue resolution.
• Works with the data engineering team on all phases of larger and more complex development projects and engages with external users on business and technical requirements.
• Collaborates with peers, engineers, data scientists, and the project team.
• Typically interacts with high-level individual contributors, managers, and program teams on a daily/weekly basis.
• Helps define and lead portions of project requirements for data exchanges and business requirements with external and internal teams.
• Collaborates with SMEs to develop procedures for collecting, recording, analyzing, and communicating data for review and feedback.
What you bring

We are looking for world-class talent that brings the following key skills and experience to this role:

• Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or equivalent.
• 10+ years of relevant experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools.
• 4+ years of experience with cloud-based data warehouses such as Redshift, Snowflake, etc.
• 4+ years of experience in Big Data distributed ecosystems (Hadoop, Spark, Hive, and Delta Lake).
• 4+ years of experience in Big Data distributed systems such as Databricks, AWS EMR, AWS Glue, etc.
• Experience leveraging monitoring tools/frameworks such as Splunk, CloudWatch, etc.
• Experience with container management frameworks such as Docker, Kubernetes, ECR, etc.
• 4+ years working with multiple Big Data file formats (Parquet, Avro, Delta Lake).
• Experience with CI/CD processes such as Jenkins, Codeway, etc., and source control tools such as GitHub.
• Strong experience in coding languages like Python, Scala, and Java.

Knowledge and Skills

• Fluent in relational systems and writing complex SQL.
• Fluent in complex, distributed, and massively parallel systems.
• Strong analytical and problem-solving skills, with the ability to represent complex algorithms in software.
• Experience building lambda, kappa, microservice, and batch architectures.
• Experience designing data systems/solutions to manage complex data.
• Strong understanding of database technologies and management systems.
• Strong understanding of data structures and algorithms.
• Database architecture testing methodology, including execution of test plans, debugging, and testing scripts and tools.
• Effective communication skills (with team members, the business, and in code).
• Ability to effectively communicate product architectures and design proposals and to negotiate options at management levels.
• A passion for data solutions and willingness to pick up new programming languages, technologies, and frameworks.

Nice to Have

• Experience with transformation tools such as dbt.
• Experience building real-time streaming data pipelines.
• Experience with pub/sub streaming technologies like Kafka, Kinesis, Spark Streaming, etc.

HP offers a comprehensive benefits package, including:

• Dental insurance
• Disability insurance
• Employee assistance program
• Flexible schedule
• Flexible spending account
• Health insurance
• Life insurance

Per the Washington statute, the estimated range of compensation for this job in Vancouver, at the time of this posting, is $148,500 to $181,500. This position may be eligible for incentive pay, for openings where this is applicable.

#LI-POST

Job - Software
Schedule - Full time
Shift - No shift premium (United States of America)
Travel - No
Relocation - Yes
EEO Tagline - HP Inc. is EEO F/M/Protected Veteran/Individual with Disabilities.