Lead Data Engineer in Boston, MA at Eastern

Date Posted: 10/9/2018

Job Snapshot

  • Location:
    Boston, MA
  • Experience:
    Not Specified
Job Description

When you join Eastern Bank, you join the largest and oldest mutual bank in the country. We pride ourselves on having knowledgeable and experienced professionals who can provide the expertise, personal attention and service our customers deserve.
As the fourth-largest full-service commercial bank in Greater Boston, we are able to offer you unique opportunities for career growth throughout the organization, as well as a competitive compensation and benefits program.
As an inclusive company, we work to ensure that our valued employees are treated fairly, recognized for their individuality, and encouraged to reach their fullest potential. These values have earned us a reputation as a great place to work and provide a strong reason to consider a career with us.

The Lead Data Engineer is part of our Data Management team and serves as the lead technical architect, designer and implementer of our enterprise big data ecosystem. This is a hands-on role that requires big-picture planning, establishing technical direction, providing leadership from ingestion through consumption, deep knowledge of big data/Hadoop applications, and experience building and optimizing large-scale data stores, real-time streaming and ETL. You'll work with internal data scientists, technical leaders and line-of-business experts to identify useful data sets and to cleanse, transform and organize them into solutions that provide meaningful business insights. As a data innovator, you'll also help identify creative ways to leverage data that have not previously been considered.

Responsibilities include:

  • Architect, engineer, design and implement the appropriate Big Data applications and/or technologies that support the Analytics Platform roadmap. Provide big-picture thinking, innovation and planning, including creating technical roadmaps and presentations and presenting to senior leaders. Collaborate with other technical leads to strategize and to support one another's needs and ideas.
  • Develop, manage and publish the necessary enterprise architecture artifacts. Follow SDLC standards and processes for deploying new code releases.
  • Build, install, manage and operate our Analytics Platform, from ingestion and data marshalling through user consumption. Provide hands-on development for the continued build-out of ETL, real-time streaming and data set organization.
  • Provide operational support as required while developing innovative approaches that drive toward ‘lights-out’ management and self-healing. Develop a plan to transition monitoring and Level 1 support of Analytics Platform operations to the Technology Operations (TechOps) team.
  • Research new tools/technologies and manage existing/new vendor relationships. 
  • Manage, develop and implement techniques, tools and procedures to ensure data is secured and protected. Determine, test and implement the best techniques for data masking or data encryption based on the use case.
  • Partner with business lines, data scientists and technology architects to understand business drivers and opportunities, continually evolving and expanding the Analytics Platform to provide increasing value to the enterprise. Be a key contributor to our business intelligence user group.


Job Requirements

Education and Experience 

  • Bachelor’s degree in Computer Science, Engineering or equivalent combination of training and/or experience
  • 8+ years of combined experience in databases, Big Data or data warehousing

  • Minimum of 3 years’ experience developing in Java/Scala/Python
  • Minimum of 3 years’ experience designing scalable, highly concurrent server architectures in the Big Data ecosystem
  • Minimum of 2 years’ experience building, installing, managing and operating a Big Data ecosystem (MapR, Cloudera, Hortonworks)
  • Minimum of 3 years’ experience designing and building a modern, real-time big data platform (SQL, Hadoop, Kafka, etc.)
  • Minimum of 3 years’ experience building complex ETL pipelines using tools such as SSIS, Talend or Informatica
  • Minimum of 3 years’ experience building distributed ETL systems across a range of platforms and technologies
  • Experience with the Elastic Stack (Elasticsearch, Logstash, Kibana)
  • In-depth experience with ETL design, implementation and management
  • Solid understanding of software development fundamentals, including algorithms, data structures, and design patterns
  • Experience researching new tools and technologies to optimize data integration and analysis
  • Experience with various testing practices, including regression, acceptance, functional, and load testing
  • Experience creating architectural diagrams and presentations and presenting to senior leaders
  • Experience with Information Security principles and best practices

Additional Skills

  • Experience optimizing and scaling reporting technologies such as Tableau, Qlik or Domo
  • Experience working as a Hadoop administrator (MapR)
  • Experience with Machine Learning (R, Python)
  • Familiarity with Hadoop ecosystem technologies such as Talend, Sqoop, Drill, Hive
  • Familiarity with cloud-based data environments (AWS, Azure)
  • Familiarity with massively parallel processing (MPP) database systems (Vertica, Redshift, etc.)
  • Familiarity with NoSQL architectures (MongoDB, Cassandra, HBase)
  • Experience with encryption and/or masking approaches (DataGuise)
  • Familiarity with Agile development methodology (JIRA) and code management using Git
  • Experience with data warehousing design principles and Star or Snowflake data models
  • Knowledge of banking operations and practices