Bangalore, India
Data Engineer
Hello Talented Techie!
Role Summary
The Data Engineer will be responsible for designing, building, and maintaining scalable solutions on cloud data platforms that support analytics, reporting, and AI-driven use cases. The role focuses on Snowflake-based data architecture, data transformation using dbt, and advanced analytics using Python, SQL, and Azure OpenAI services.
Job Responsibilities
• Collaborate with business and technical stakeholders to understand requirements and translate them into scalable data solutions using Snowflake and Azure
• Design, develop, and maintain end-to-end data pipelines leveraging Snowflake Streams, Tasks, Snowpipe, Time Travel, and Fail-safe
• Build and manage secure data ingestion frameworks, including Snowflake external stages and storage integrations
• Develop and maintain dbt models using SQL and Python, including macros, Jinja templating, testing, and deployment automation
• Create and optimize analytics-ready data models in Snowflake to support reporting, self-service analytics, and downstream applications
• Implement Snowflake stored procedures, UDFs, and Snowpark (Python) to support complex data transformations and orchestration logic
• Apply performance and cost optimization techniques in Snowflake, including query tuning, warehouse sizing, and clustering strategies
• Design and implement AI-enabled data solutions using Azure OpenAI, including LLM-based workflows and RAG pipelines for structured and unstructured data
• Develop and deploy Python-based services and APIs to expose data and AI capabilities
• Orchestrate batch and near real-time workflows using dbt Cloud and scheduling/orchestration tools
• Ensure data security, governance, and access control using role-based access control (RBAC) and cloud best practices
• Monitor, maintain, and continuously improve platform reliability, scalability, and performance
Required Skills and Experience
• Strong experience with Snowflake data warehousing and cloud data architecture
• Hands-on expertise in dbt, including data modeling, macros, testing, and CI/CD workflows
• Advanced proficiency in SQL and Python for data transformation and automation
• Experience building AI-powered solutions using Azure OpenAI and LLM-based architectures
• Solid understanding of cloud-native data engineering principles and security best practices
• Experience with workflow orchestration and production-grade data pipelines
• Willingness to experiment with new technologies and execute proofs of concept (POCs)
Create a better #TomorrowWithUs!
We value your unique identity and perspective and are fully committed to providing equitable opportunities and building a workplace that reflects the diversity of society. Come bring your authentic self and create a better tomorrow with us.
This role is based in Bangalore, but you’ll also get to visit other locations in India and around the globe, so you’ll need to go where this journey takes you. In return, you’ll get the chance to work with teams impacting entire cities, countries and the shape of things to come. We’re Siemens. A collection of over 379,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow.
Find out more about Siemens careers at: www.siemens.com/careers