
Data Analyst

Position ID
487755
Posted
09-Dec-2025
Organization
Global Business Services
Field
Information Technology
Company
Siemens Technology and Services Private Limited
Experience Level
Experienced Professional
Job Type
Full-time
Work Mode
On-site only
Contract Type
Permanent
Location
  • Bengaluru - Karnataka - India

Hello eager tech expert!

We are looking for an experienced Data Engineer with strong hands-on expertise in Snowflake, SQL, dbt Cloud, GitLab, and cloud-based data engineering. The ideal candidate will have solid experience in designing, developing, and maintaining data pipelines, data models, and data warehouse solutions across various business domains.



You’ll break new ground by:
  • Designing, developing, and maintaining scalable data pipelines, data models, and data flows for data integration and data warehousing projects, leveraging Snowflake, SQL, and dbt Cloud.
  • Developing, optimizing, and maintaining complex SQL queries, stored procedures, and ELT workflows, including performance tuning and data warehouse optimization within Snowflake-based ecosystems.
  • Building, testing, deploying, and maintaining high-quality, modular data models that support analytics, reporting, and downstream applications.
  • Implementing and managing API integrations in Snowflake, including the use of external functions and Snowflake-native APIs.
  • Conducting data quality checks and implementing automated testing strategies using dbt Cloud and CI/CD pipelines.
  • Leveraging GitLab for version control, code review, collaboration, and CI/CD deployment of dbt projects.
  • Using SnowSQL, Snowpipe, and other Snowflake utilities to support ingestion, automation, and orchestration workflows.
  • Developing and maintaining documentation for data models, business logic, and transformation processes.
  • Working with relational and NoSQL data stores, leveraging star/snowflake schema modeling and dimensional design.
  • Ensuring data quality, security, and access controls across all developed solutions.
  • Working with cloud technologies such as AWS S3 or Azure ADLS for data storage and ingestion.
  • Collaborating with cross-functional teams to understand business requirements and translate them into scalable technical solutions.
  • Participating in Agile ceremonies and contributing to the timely delivery of user stories and project milestones.
  • Troubleshooting performance issues, optimizing queries, and ensuring efficient data processing.
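To give candidates a concrete feel for the dimensional-modeling and SQL work described above, here is a minimal, self-contained sketch of a star schema and a typical analytical query. It uses SQLite purely for portability (the role itself targets Snowflake), and all table and column names are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a warehouse; schema names are invented.
con = sqlite3.connect(":memory:")
cur = con.cursor()

# A minimal star schema: one fact table referencing one dimension table.
cur.executescript("""
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
);
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      REAL
);
""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, 2, 20.0), (2, 1, 1, 10.0), (3, 2, 5, 50.0)])

# A typical analytical query: aggregate the fact table by a dimension attribute.
cur.execute("""
SELECT p.product_name, SUM(f.amount) AS revenue
FROM fact_sales f
JOIN dim_product p USING (product_key)
GROUP BY p.product_name
ORDER BY revenue DESC
""")
rows = cur.fetchall()
print(rows)  # [('Gadget', 50.0), ('Widget', 30.0)]
con.close()
```

In a dbt project, the SELECT above would typically live in its own model file, with the fact and dimension tables referenced via `ref()` rather than hard-coded names.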

You’re excited to build on your existing expertise, including:

  • Overall 5–6 years of experience in data engineering, data warehousing, or ELT/BI development, with at least 3 years of mandatory hands-on experience in Snowflake development leveraging dbt Cloud and GitLab.
  • A graduate degree in Computer Science or Information Management.
  • Proficient in SQL, including complex query writing, performance tuning, and handling large datasets.
  • Strong experience using dbt Cloud to build and maintain modular data models, including developing custom Jinja macros, implementing tests, seeds, snapshots, and managing materializations for scalable data transformations.
  • Proven experience with Snowflake development, including ELT pipelines, data modeling, stored procedures, resource monitors, RBAC, query tuning, and warehouse configuration.
  • Hands-on experience managing API integrations within Snowflake (e.g., external functions, REST API endpoints).
  • Proficient in using GitLab for version control, branching strategies, and CI/CD automation.
  • Knowledge of dbt Core sufficient to handle deployments outside the standard technical infrastructure.
  • Hands-on experience with SnowSQL and Snowpipe for data ingestion, automation, and interaction with Snowflake environments.
  • Strong understanding of dimensional modeling, star and snowflake schemas, and modern data architecture concepts.
  • Working knowledge of cloud platforms (AWS or Azure), especially S3/ADLS.
  • Familiarity with Snowflake’s AI and machine learning functionalities, such as Snowpark ML or Snowflake Cortex, is a valuable advantage.
  • Familiarity with Python-based data engineering tools and utilities for automation, data processing, and workflow orchestration.
  • Proven experience collaborating within Agile/Scrum teams to deliver iterative, high-quality data solutions through continuous development, testing, and feedback cycles.
  • Strong verbal and written communication skills (English) to collaborate with technical and non-technical teams.
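The GitLab CI/CD and dbt deployment skills listed above are commonly expressed as a pipeline definition. The fragment below is an illustrative sketch only, not Siemens' actual configuration; the image tag, job names, and target names are assumptions:

```yaml
stages:
  - test
  - deploy

# Hypothetical image; any image with dbt available works.
image: python:3.11-slim

before_script:
  - pip install dbt-snowflake   # dbt Core adapter for Snowflake
  - dbt deps                    # install packages declared in packages.yml

dbt_test:
  stage: test
  script:
    - dbt build --target ci     # run models and tests against a CI target
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

dbt_deploy:
  stage: deploy
  script:
    - dbt build --target prod   # deploy to production on the default branch
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
```

Connection credentials for the `ci` and `prod` targets would normally be injected via masked GitLab CI/CD variables rather than committed to the repository.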

Create a better #TomorrowWithUs! 

  • We value your unique identity and perspective and are fully committed to providing equitable opportunities and building a workplace that reflects the diversity of society. Come bring your authentic self and create a better tomorrow with us. 
  • Protecting the environment, conserving our natural resources, fostering the health and performance of our people as well as safeguarding their working conditions are core to our social and business commitment at Siemens. 
  • This role is based in Pune/Mumbai. You’ll also get to visit other locations in India and beyond, so you’ll need to go where this journey takes you. In return, you’ll get the chance to work with an international team on global topics.