Sr. Data Engineer
The Hartford
Sr Data Engineer – GE07BE
We’re determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals – and to help others accomplish theirs, too. Join our team as we help shape the future.
The Hartford is seeking a Senior Data Engineer to join our Feature Integration team within the Data Science Enablement organization. This person is expected to be adaptable and a quick learner in order to support our Entity Resolution tech stack. The candidate must also possess strong communication skills to work effectively with the business and other delivery teams.
This role will have a Hybrid work arrangement, with the expectation of working in an office (Hartford, CT; Charlotte, NC; Chicago, IL; Columbus, OH) 3 days a week (Tuesday through Thursday).
Responsibilities:
Accountable as a Senior Data Engineer for a large-scale data product. Drive end-to-end solution delivery involving multiple platforms and technologies of medium to large complexity, or oversee portions of very large, complex implementations, leveraging ELT solutions to acquire, integrate, and operationalize data.
Partner with architects and stakeholders to influence and implement the vision of the pipeline and data product architecture while safeguarding the integrity and scalability of the environment.
Articulate risks and tradeoffs of technology solutions to senior leaders with translations as needed for business leaders.
Accountable for physical solution designs of data pipelines and products across teams, as well as for tool recommendations.
Accountable for data engineering practices across all the teams involved.
Implement and utilize leading big data technologies (AWS, Hadoop/EMR, Spark, Kafka, Snowflake, and Talend) with cloud/on-premises hybrid hosting solutions, at a multi-team/product level.
Knowledge, Skills, and Abilities:
Strong technical knowledge of cloud data pipelines and data consumption products.
A leader and team player with a transformation mindset.
Ability to lead successfully in a lean, agile, and fast-paced organization, leveraging Scaled Agile principles and ways of working.
Guides teams to mature code quality management, DataOps principles, automated testing, and environment management practices to deliver incremental customer value.
Qualifications:
Bachelor’s degree in computer science or a related field of study
5 years of experience in Python/PySpark
5 years of large-scale big data engineering experience, including designing best practices in programming, SDLC, distributed systems, data warehousing solutions (SQL and NoSQL), ETL tools, CI/CD, and cloud technologies (AWS/Azure)
3 years of experience developing and operating production workloads in cloud infrastructure (AWS, Azure, etc.)
Preferred Qualifications:
Exposure to AWS best practices
Knowledge of core functional components/services of AWS – compute, storage, edge, database, migration and transfer, networking, and governance.
Compensation
The listed annualized base pay range is primarily based on analysis of similar positions in the external market. Actual base pay could vary and may be above or below the listed range based on factors including but not limited to performance, proficiency and demonstration of competencies required for the role. The base pay is just one component of The Hartford’s total compensation package for employees. Other rewards may include short-term or annual bonuses, long-term incentives, and on-the-spot recognition. The annualized base pay range for this role is:
$113,360 – $170,040
Equal Opportunity Employer/Females/Minorities/Veterans/Disability/Sexual Orientation/Gender Identity or Expression/Religion/Age
About Us (https://www.thehartford.com/about-us) | Culture & Employee Insights (https://www.thehartford.com/careers/employee-stories) | Diversity, Equity and Inclusion (https://www.thehartford.com/about-us/corporate-diversity) | Benefits (https://www.thehartford.com/careers/benefits)