Principal Data Engineer: Responsible for the end-to-end data migration and deployment lifecycle. Implement, deploy, and maintain multiple AI projects using different Machine Learning algorithms. Implement data discovery and optimization using cloud and Machine Learning models. Implement Snowflake stored procedures on star-schema data using Snowpipe, tasks, and streams. Stream data with Kafka from source systems (Azure) to the Snowflake staging layer. Handle project delivery, resource management, project planning, and deployment. Provide Machine Learning support. Prepare the test plan and test cases for the application under test. Automate data quality validations between the source system and the target system.
Minimum Requirements: Bachelor's degree in a science, engineering, or information systems-related field with 5 years of work experience. Position requires expertise in Machine Learning/AI, statistical analysis, RDBMS, Big Data (Hadoop, NoSQL DB, Hive, Cassandra, Kafka, Talend Studio), data warehouse & cloud (Snowflake, Azure, Informatica), Python programming, & SQL DB.
Travel/Relocation