Software Developer (Teradata 16.x, SQL/BTEQ scripting, TPT, Informatica, Autosys, Cloudera CDP, Hadoop)
Location: Plano, TX / Kennesaw, GA / NC
Duration: 12 months

Description:
Responsible for developing complex requirements and enhancing, modifying, and/or maintaining applications in Financial Crimes technology to accomplish business goals. Software developers design, code, test, debug, and document programs, and support activities for maintaining the application. Work closely with business partners to define requirements for system applications. Typically requires 10+ years of applicable experience.

Responsibilities:
- Utilize in-depth knowledge of business requirements, business environments, and technological alternatives to recommend innovations that enhance and/or provide a competitive advantage to the organization.
- Provide insight and direction from a data perspective, assess the impact on technology systems, and participate in the full development lifecycle of the capability being delivered.
- Contribute to story refinement and defining requirements; estimate the work necessary to realize a story/requirement through the delivery lifecycle.
- Perform proofs of concept as necessary to mitigate risk or implement new ideas.
- Set up and automate the continuous integration/continuous delivery (CI/CD) pipeline.
- Work closely with Production Support teams, Platform teams, and Business Partners to handle technical aspects of the application, including change management, maintenance, platform upgrades, and changes to requirements from both upstream and downstream interfacing applications.
- Demonstrate data sourcing, data analysis, and modeling skills, with the ability to build innovative data provisioning models that support large-scale financial crimes data sourcing initiatives.
- Work with a team of Data Analysts and Developers to ensure data best practices and governance are followed.
- Promote and apply best practices and standards at the project and program level.
- Partner with the business to develop plans, including ongoing success measures, to sustain the change.
- Analyze the present state, develop alternative future-state approaches, and facilitate implementation.
- Communicate effectively with managers, peers, and business partners on deliverables and timelines.
- Follow Agile practices while ensuring all Enterprise Change Standards are met.

Required Skills:
- Experience with Teradata 16.x, SQL/BTEQ scripting, TPT, Informatica 10.x, UNIX shell scripting, Autosys, Cloudera CDP, Hadoop (Hive, Impala, Spark, Scala), and data analysis.
- Strong data sourcing, data modeling, and provisioning skills to support large-scale Big Data applications.
- Experience executing database-intensive development, data migrations, and conversions.
- Knowledge of and experience with SQL performance tuning; experience in query optimization and performance tuning of complex SQL queries is a must.
- Good experience with the SDLC, Agile, continuous integration/continuous delivery (CI/CD), and change management tooling: Jira, Bitbucket, Jenkins, Artifactory, Ansible.
- Ability to communicate clearly with team stakeholders.
- Experience building large data lakes.
- Good knowledge of UNIX and shell scripting is required.
- Software development in an Agile environment.
- Experience building a software product/platform/framework from the ground up.
- Apache Spark committer.
- Strong problem-solving and troubleshooting skills.
- Able to work under pressure and tight deadlines.
- Takes pride in writing elegant code that meets industry standards.
- Design and development experience with modern technologies such as API management, REST/API integration, containers, and microservices.
- Experience or familiarity with orchestration tools, such as Airflow.
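The SQL query optimization and performance tuning called for above can be illustrated with a minimal sketch. The example below uses Python's built-in sqlite3 module rather than Teradata, and the table and index names are hypothetical; it shows the core tuning step of checking a query plan and adding an index so a filtered lookup stops scanning the whole table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO txn (account, amount) VALUES (?, ?)",
    [(f"acct{i % 100}", float(i)) for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the plan description in the last column
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM txn WHERE account = 'acct7'"
before = plan(query)   # without an index, the plan is a full table scan
conn.execute("CREATE INDEX idx_txn_account ON txn (account)")
after = plan(query)    # with the index, the plan searches idx_txn_account
print(before)
print(after)
```

The same check-plan/add-index/recheck loop applies on Teradata (via EXPLAIN), though the plan output and indexing options differ by engine.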
- Experience with, and understanding of, peripheral systems (network, load balancing/workload management, application development, operating system) and monitoring systems (Splunk, ITRS, Introscope, SiteScope, etc.).
- Experience with private cloud and public cloud platforms such as Azure, AWS, and Google Cloud.
- Experience implementing enterprise-class data lakes/data lakehouses.
- Experience with Snowflake, Azure Synapse, AWS Redshift, or Databricks is preferred.

Technology Specialist Skillset: Yes | Informatica
Qualification: Enterprise Databases: SQL, UNIX Shell Scripting, Informatica
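Orchestration tools such as Airflow, named in the requirements, run pipeline tasks in dependency order. A minimal sketch of that core idea, using only the Python standard library (graphlib, Python 3.9+) and hypothetical task names rather than a real Airflow DAG, is:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline stages; each task maps to the set of tasks it depends on
dag = {
    "extract_teradata": set(),
    "load_hive": {"extract_teradata"},
    "transform_spark": {"load_hive"},
    "publish_marts": {"transform_spark"},
}

# static_order() yields every task only after all of its dependencies
order = list(TopologicalSorter(dag).static_order())
print(order)
```

An orchestrator like Airflow layers scheduling, retries, and monitoring on top of exactly this kind of dependency graph.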