We spend up to 90 percent of our lives in buildings, and we believe that everything people do in life deserves a flawless place to do it. In a world where our fundamental expectations of health, safety and wellbeing have been deeply shaken by the anxiety of a new virus, buildings should offer a haven. A perfect place to learn. A flawless place to grow. A perfect place to prosper.
While it’s true that today’s buildings should be efficient, reliable and safe – these characteristics alone don’t enable businesses and empower people the way a true smart building can.
How do you craft the smart buildings of the future? We’re looking for the makers of tomorrow: hardworking individuals ready to help Siemens transform entire industries, cities and even countries. Get to know us from the inside and develop your skills on the job. Join our Pune team to make a difference!
What are my Job Responsibilities?
- You will implement and maintain our data pipelines, ingesting data from various data sources and products (e.g., streaming IoT data)
- You will take responsibility for extending and maintaining our data analytics platform, of which the data pipelines are a central part
- The core of this role is data engineering and software engineering; however, it sits within a global data science environment
- You will work closely with global, cross-functional teams – Architects, Data Scientists, DevOps and Product Managers – to understand and implement the solution and data requirements
- You will track the implementation, drive integration across various departments, and monitor the usage of data analytics products
- You will ensure our numerous APIs are well maintained, and test and validate the environment with end users
- You will support the team head and other colleagues with daily tasks where necessary
- Working knowledge of a Content Management System based on the single-sourcing concept is preferable
What do I need to be Eligible for this Role?
- You should have experience with, and a willingness to explore, big data pipeline and compute tooling such as Luigi, Airflow, Beam, Spark and Databricks. When it comes to methodologies, knowledge of agile software development processes would be highly valued
- A university degree in Computer Science or a comparable education; we are flexible as long as a high quality of code is ensured
- Our team needs someone who can hit the ground running, bringing a few years of professional experience in software engineering with Python
- As you will be working with these from day one, we expect familiarity with AWS services beyond EC2 (e.g., Fargate, Batch, RDS, SageMaker)
- You have the right attitude, allowing you to navigate a complex global organization and get things done. We need a person with an absolute willingness to support the team and a proactive, stress-resistant personality
- There are many learning opportunities for our new team member; an openness to learn about data analytics (including AI) offerings should be part of your motivation
- Business fluency in English
Make your mark in our exciting world at Siemens.
This role is based in Pune; however, you may get the opportunity to visit other locations within India as and when the work demands.
We’ve got quite a lot to offer. How about you?
We’re Siemens. A collection of over 379,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we encourage applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow.
Find out more about Siemens careers at: https://new.siemens.com/global/en/company/jobs.html
Company: Siemens Technology and Services Private Limited
Experience Level: Experienced Professional
Full / Part time: Full-time