Sr Data Engineer - SRDAT010627
DESCRIPTION/RESPONSIBILITIES: Position Title: Senior Data Engineer (P4)
Role: Crown Castle's Data & Digital team has been evolving its capabilities across all technology functions, investing in the team, technology, data, and processes. The Data & Digital team is leading innovation by developing microservice and event-based solutions across products, applications, platforms, and data, utilizing open-source technologies in the cloud to drive Crown Castle's technology evolution.
As a Senior Data Engineer specializing in Snowflake, Databricks, Kafka, AWS Glue, Airflow, and AWS, you will provide technical leadership in the design and advancement of our data infrastructure. You will design, build, and optimize scalable, performant data pipelines, ensuring data quality and reliability in support of a variety of data initiatives across the organization.
Responsibilities
* Collaborate closely with cross-functional teams, including Data Scientists, Analysts, and Software Engineers, to understand data requirements and translate them into efficient solutions using Snowflake, Databricks, Kafka, AWS Glue, Airflow, and AWS services.
* Design and implement end-to-end data pipelines, leveraging Kafka for real-time data streaming, Snowflake for scalable data warehousing, and Databricks for advanced analytics.
* Develop and maintain ETL processes using AWS Glue to facilitate seamless and reliable data extraction, transformation, and loading into Snowflake and Databricks.
* Optimize data models and schema designs for improved query performance within Snowflake and Databricks, while ensuring data consistency and integrity.
* Implement robust data security measures and access controls in alignment with company policies and industry best practices.
* Utilize Airflow to orchestrate and schedule data workflows, ensuring timely and accurate data processing.
* Monitor, troubleshoot, and enhance data pipelines, identifying and resolving performance bottlenecks, data quality issues, and other challenges.
* Contribute to the evolution of data warehousing strategies, encompassing data partitioning, clustering, and distribution within Snowflake and Databricks.
* Monitor industry trends and technological advancements in Snowflake, Databricks, Kafka, AWS Glue, Airflow, and AWS services, and advocate for their effective use within the organization.
* Provide mentorship and guidance to the data engineering team to foster growth and facilitate knowledge sharing.
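To give candidates a feel for the extract-transform-load work described above, here is a purely illustrative sketch of the pattern. This is not Crown Castle's actual code; it uses only the Python standard library (a real pipeline would use AWS Glue, Airflow, Kafka, etc.), and the `site_id`/`signal_db` fields are invented example data.

```python
import csv
import io
import json

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV records into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: cast types and drop rows that fail a basic quality check."""
    clean = []
    for row in rows:
        try:
            clean.append({"site_id": row["site_id"],
                          "signal_db": float(row["signal_db"])})
        except (KeyError, ValueError):
            continue  # a production pipeline would quarantine/log bad records
    return clean

def load(rows: list[dict]) -> str:
    """Load: serialize to newline-delimited JSON, as a warehouse stage might ingest."""
    return "\n".join(json.dumps(r) for r in rows)

# Hypothetical input: one record ("A2") has an unparseable measurement.
raw = "site_id,signal_db\nA1,-71.5\nA2,not_a_number\nA3,-64.0\n"
staged = load(transform(extract(raw)))
```

The bad "A2" record is filtered out during the transform step, so only two clean records reach the load stage; in the real role, orchestration, retries, and data-quality monitoring around steps like these would be handled by Airflow and Glue.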
Expectations
* Self-motivated individual who can handle ambiguous, undefined problems and think abstractly to deliver results.
* Demonstrates a strong sense of ownership, urgency, and drive, as well as the ability to work well across diverse teams.
* Ability to effectively articulate technical challenges and solutions to business users and other technical teams.
* Ability to develop compelling insights and logical arguments to persuade others.
* Critical thinking skills: an understanding of the relationships among environment, objectives, goals, and priorities that drive optimal decisions.
* Demonstrates executive maturity and the ability to communicate at all levels of the company.
* Navigates challenging interactions with candor and through constructive debate to build support and commitment for initiatives.
Education/Certifications
* Bachelor's degree in Computer Science, Information Systems, or a related discipline
* Master's degree preferred; PhD a plus
* Snowflake, Databricks, AWS, or related certifications
Experience/Minimum Requirements
* 8+ years of data engineering experience focused on Snowflake, Databricks, Kafka, AWS Glue, Airflow, and AWS services.
* Deep knowledge of Snowflake architecture, performance optimization techniques, and best practices.
* Strong SQL skills and experience with database design principles.
* Hands-on experience in designing, building, and maintaining complex ETL pipelines using Snowflake, Databricks, Kafka, and AWS Glue.
* Familiarity with data warehousing concepts and methodologies.
* Understanding of real-time data streaming principles and experience with Kafka.
* Experience with cloud computing and AWS services, with a focus on data-related services such as S3, Redshift, and Lambda.
* Proficiency in scripting and programming languages such as Python, Java, or similar.
* Excellent problem-solving skills and the ability to troubleshoot complex data-related issues.
Working Conditions: This is a remote role with the expectation of occasional on-site/in-person collaboration with teammates and stakeholders that may require up to 20% travel.
For New York, Colorado, California and Washington residents - The hiring range offered for this position is $148,000-$170,000 annually. In addition to salary, employees are eligible for an annual bonus of up to 20% of annual salary and restricted stock. Employees (and their families) are eligible for medical, dental, vision, and basic life insurance. Employees are able to enroll in our company's 401k plan. Employees will also receive 18 days of paid time off each year and 12 paid holidays throughout the calendar year.
Equal Opportunity Employer/Protected Veterans/Individuals with Disabilities

The contractor will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information. 41 CFR 60-1.35(c)
Equal Opportunity Employer-minorities/females/veterans/individuals with disabilities/sexual orientation/gender identity