We spend up to 90 percent of our lives in buildings, and we believe that everything people do in life deserves a flawless place to do it. In a world where our fundamental expectations of health, safety and wellbeing have been deeply impacted by the anxiety of a new virus, buildings should offer a haven: a perfect place to learn, a flawless place to grow, a perfect place to prosper.
While it’s true that today’s buildings should be efficient, reliable and safe – these characteristics alone don’t enable businesses and empower people the way a true smart building can.
How do you craft the Smart Buildings of the future? We’re looking for the makers of tomorrow: hardworking individuals ready to help Siemens transform entire industries, cities and even countries. Get to know us from the inside and develop your skills on the job. Join our Pune team to make a difference!
What are my Job Responsibilities?
- Take responsibility and ownership for the end-to-end operation of a data analytics environment. Build and maintain the platform that hosts our data-driven services, allowing us to process and utilize the wealth of data we have. It is mainly a back-end solution, with a limited front-end component as well.
- Integrate data analytics services into global offerings by packaging Python code produced by data scientists to be compatible with our R&D infrastructure and the target products that will consume the service.
- Track the implementation, drive the integration across various departments, and monitor the usage of data analytics products.
- The core of this job is software engineering; however, the role sits within a global data science environment.
- Work closely with global, cross-functional teams of Architects, Data Scientists, DevOps engineers and Product Managers to understand and implement the solution requirements.
- Ensure the numerous APIs are well maintained, and test and validate the environment with end users.
What Makes me Eligible for this Role?
- Educational qualification: a university degree in Computer Science or a comparable education; we are flexible as long as high code quality is ensured.
- Experience required: 3 to 8 years.
- Our team needs a self-starter who brings a few years of professional experience in software engineering, ideally with Python.
- Proven experience with common DevOps practices such as CI/CD pipelines (GitLab), container orchestration (Docker, Kubernetes, Helm) and infrastructure as code (Terraform).
- Along with hands-on DevOps experience, you should also have experience in the domain of data analytics or MLOps.
- Familiarity with AWS services beyond EC2 (e.g., Fargate, Batch, RDS, SageMaker), as you will be working with these from day one.
- Initial experience with, or willingness to explore, big data pipeline and compute tooling such as Luigi, Airflow, Beam, Spark and Databricks. Knowledge of agile software development processes is also highly valued.
Make your mark in our exciting world at Siemens.
This role is primarily based in Pune; however, you may get the opportunity to visit other locations within India as and when the work demands.
We’ve got quite a lot to offer. How about you?
We’re Siemens. A collection of over 379,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality and we encourage applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow.
Find out more about Siemens careers at: https://new.siemens.com/global/en/company/jobs.html
Company: Siemens Technology and Services Private Limited
Experience Level: Experienced Professional
Job Type: Full-time