Job Description
Join us in the Procurement Execution Center (PEC) as a Data Engineer, part of a diverse team of data and procurement professionals. In this role, you will be responsible for supporting the end-to-end (E2E) management of our data, including ETL/ELT, data warehouse/data lake (DW/DL), data staging, and data governance, and for managing the different layers of data required to ensure successful BI & Reporting for the PEC. This role works with multiple types of data spanning multiple functional areas of expertise, including Fleet, MRO & Energy, Travel, and Professional Services, among others.
How will you do it?
• Serve as the main technical resource for any data-related requirement
• Demonstrate an ability to communicate technical knowledge through project management and contributions to product strategy
• Deploy data ingestion processes through Azure Data Factory to load data models as required into Azure Synapse.
• Design and build robust, modular, and scalable ETL/ELT pipelines with Azure Data Factory, Python, and/or dbt (see the illustrative sketch after this list).
• Assemble large, complex, robust, and modular data sets that meet functional and non-functional business requirements.
• Build the infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using Data Lakehouse technologies and ADF.
• Develop data models that enable data visualization, reporting, and advanced data analytics, striving for optimal performance across all data models.
• Maintain conceptual, logical, and physical data models along with corresponding metadata.
• Manage the DevOps pipeline deployment model, including automated testing procedures.
• Deploy data stewardship and data governance across our data warehouse to cleanse and enhance our data, using knowledge bases and business rules.
• Ensure compliance with system architecture, methods, standards, and practices, and participate in their creation.
• Clearly articulate and effectively influence both business and technical teams
• Perform the necessary data ingestion, cleansing, transformation, and coding of business rules to support annual Procurement bidding activities.
• Support the deployment of a global data standard.
• Create data tools for analytics and data science team members that help them build and optimize our product into an innovative industry leader.
• Support Rate Repository management as required (including Rate Card uploads to our DW).
• Other Procurement duties as assigned.
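
As an illustration of the kind of modular ETL/ELT step described above, here is a minimal Python sketch of an extract-transform-load flow against a staging table. The file name, connection string, table name, and cleansing rules are hypothetical placeholders for illustration only; in practice, equivalent logic would typically be orchestrated through Azure Data Factory pipelines or dbt models rather than a standalone script.

    import pandas as pd
    from sqlalchemy import create_engine

    def extract(source_csv: str) -> pd.DataFrame:
        # Extract: read raw spend data from a flat file (hypothetical source).
        return pd.read_csv(source_csv)

    def transform(raw: pd.DataFrame) -> pd.DataFrame:
        # Transform: apply simple cleansing and business rules before staging.
        cleaned = raw.dropna(subset=["supplier_id", "amount"]).copy()
        cleaned["category"] = cleaned["category"].str.strip().str.upper()
        cleaned["amount"] = cleaned["amount"].astype(float)
        return cleaned

    def load(staged: pd.DataFrame, connection_string: str, table: str) -> None:
        # Load: write the cleansed data set to a staging table in the warehouse.
        engine = create_engine(connection_string)
        staged.to_sql(table, engine, if_exists="replace", index=False)

    if __name__ == "__main__":
        df = transform(extract("spend_extract.csv"))            # hypothetical file
        load(df, "sqlite:///staging.db", "stg_supplier_spend")  # hypothetical target

Keeping extract, transform, and load as separate functions mirrors the modular pipeline design called for above, so individual steps can be tested and reused independently.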
What are we looking for?
• Bachelor’s degree in related field (Engineering, Computer Science, Data Science or similar)
• 4+ years of relevant professional experience in BI engineering, data modeling, data engineering, software engineering, or other relevant roles.
• Strong SQL knowledge and experience working with relational databases.
• Knowledge of DW/DL concepts, data marts, data modeling, ETL/ELT, data quality/stewardship, distributed systems, and metadata management (a brief data-quality sketch follows this list).
• Experience building and optimizing data pipelines, architectures, and data sets.
• Azure Data Engineering certification preferred (DP-203)
• ETL/ELT development experience (4+ years); ADF, dbt, and Snowflake are preferred.
• Ability to resolve ETL/ELT problems by proposing and implementing tactical/strategic solutions.
• Strong project management and organizational skills.
• Experience with object-oriented/functional scripting languages: Python, Scala, R, etc.
• Experience with NoSQL databases is a plus, to support the transition from on-premises to the cloud.
• Excellent problem solving, critical thinking, and communication skills
• Relevant experience with Azure DevOps (CI/CD, git/repo management) is a plus
• Due to the global nature of the role, proficiency in English is a must.
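
As a brief illustration of the data quality/stewardship work referenced above, the following Python sketch applies simple, rule-based checks to a rate card before it is loaded to the DW. The column names, rules, and sample data are hypothetical and shown only to convey the idea of codified business rules.

    import pandas as pd

    def validate_rate_card(rates: pd.DataFrame) -> list[str]:
        # Apply simple business rules to a rate card prior to DW load.
        issues = []
        if rates["rate"].lt(0).any():
            issues.append("Negative rates found")
        if rates.duplicated(subset=["supplier_id", "service_code"]).any():
            issues.append("Duplicate supplier/service combinations found")
        missing = rates["currency"].isna().sum()
        if missing:
            issues.append(f"{missing} rows missing currency")
        return issues

    # Example usage with a small in-memory frame (hypothetical data):
    sample = pd.DataFrame({
        "supplier_id": [1, 1],
        "service_code": ["A", "A"],
        "rate": [10.0, -5.0],
        "currency": ["USD", None],
    })
    print(validate_rate_card(sample))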