- In collaboration with all relevant stakeholders (e.g., Chapter & Delivery Lead, Business, Data Architects) and based on predefined business & technical requirements, you develop our Data Warehouse and Data Lake of today and tomorrow.
- You therefore take end-to-end responsibility: taking over the business requirements for a specific use case/user story, translating them into technical solution designs (together with the Data Architects) and developing the complete technical solution (incl. ETL jobs, technical coding where necessary, testing, deployment & release).
- Where necessary, you solve technical challenges yourself by programming well-suited technical modules in e.g. Python, Scala and/or C#, for instance for user-defined functions.
- You develop, update and take accountability for our development guidelines (e.g., Data Lake, ADF, DWH), not only for SQL Data Warehouse Developers but also for Low Code DWH Developers, and align these guidelines with all stakeholders (e.g., Data Governor, Data & BI Engineers in Business & IT).
- You take a guiding role for other data engineers (internal & external) by being the first point of contact, e.g. for the usage of the data warehouse automation tool or for data modelling topics (Data Vault 2.0), and thus ensure high-quality outcomes across the data engineering community.
- You build up and moderate communities of practice for (citizen) data engineers and train them on our Cloud DWH technologies, automation tools & modelling techniques.
- The basis of your success is a degree in Information Technology, Mathematics, Physics, Business Administration or equivalent, with at least 5 years' experience; knowledge of the finance and banking industry is a big plus.
- You are enthusiastic about data, data warehousing and engineering, and you have at least 3 years of experience as a Data Warehousing Developer and/or Data Engineer.
- You have a proven track record with Cloud Data Warehouse technologies (ideally Snowflake on Azure), and you are familiar with the Azure ecosystem, including Azure Data Lake Gen2, Delta Lake, Azure Data Factory and Azure Functions.
- You are an expert in Data Modelling with Data Vault 2.0 and have worked with at least one Data Warehouse Automation Tool, such as dbt/dbt Cloud, WhereScape, VaultSpeed and/or Datavault Builder.
- You are familiar with different Data Warehouse architectures (Multi-Layer, Lambda, Kappa) and data processing techniques (load design patterns, CDC, streaming, batch, etc.).
- You have expert-level programming skills in SQL.
- You have expert knowledge of at least one programming language such as Python, Scala, Java, C++ or C#, and you are familiar with design patterns and object-oriented thinking.
- You have deep knowledge of CI/CD pipelines, Azure DevOps, Git, GitHub and/or Git Bash.
- You are highly motivated and convince through your hands-on, can-do mentality, your flexibility, and your proactive support. Beyond that, you believe in agility and like to work with Kanban, Scrum and/or SAFe.
- Furthermore, you can explain complex technical problems clearly to both technical and non-technical audiences and are not afraid of presenting in front of a group from time to time.
- You see functional leadership as a possible next step in your career.
- English is required; German is a plus.
- Experience with Low Code/Citizen DWH developers is a plus.
- Proven project experience in migrating on-premises data warehouses such as SAP BW/SAP HANA, Oracle or Microsoft SQL Server to the cloud (ideally Snowflake on Azure) would be a great plus.
Organization: Siemens Financial Services
Company: Siemens S.A.
Experience Level: Experienced Professional
Job Type: Full-time