A Data Engineer with strong knowledge of data warehousing and a basic grounding in the Big Data domain.
• Expert knowledge and multi-year (5+) experience in data transformation, databases, data governance and data quality management.
• Well-versed in relational database design, with experience processing and managing large data sets (multi-TB scale) (e.g. T-SQL, Microsoft SQL Server, Oracle).
• Exceptional knowledge of data warehousing solutions (e.g. Teradata, Snowflake).
• Exceptional knowledge of data warehousing concepts (data modelling, star schema, snowflake schema, etc.).
• Strong knowledge of Extract, Transform & Load (ETL) concepts (any tool, e.g. SAS DI Studio, Informatica, etc.).
• Profound knowledge of data processing platforms (KNIME preferred, alternatively Alteryx).
• Good knowledge and hands-on experience with programming languages such as Python or Scala.
• Good know-how and experience with DataOps (data orchestration / workflow management and monitoring systems), with the ability to suggest improvements to CI/CD.
• Knowledge of any reporting/BI tool (e.g. Qlik, Tableau, Power BI, SAS VA).
• Knowledge of Azure cloud-based data storage and services such as Azure Blob Storage, Azure Databricks, Azure Data Factory, etc. is preferred.
• Hands-on experience working with APIs (e.g. REST, Oracle CRM environment, SAP, web-scraping technologies) is a plus.
• Knowledge of SAS code and SAS-based tools is a plus.
• Strong inclination towards working on Big Data projects.
• Strong written and verbal communication skills to work with partners across the globe.
Grip Position: