#158917

Hadoop Data Engineer

Location:
Hartford, Connecticut
Job Terms:
Temp-to-Perm
Salary:
Competitive w/ full benefits
Start date:
06/15/2020
Posted By:
PJ Farnham
Date:
06/01/2020

Job Description:

Our Fortune-listed client is looking for a talented Hadoop Data Engineer. This is one of our top clients, and we have been successful in building out entire teams for this organization. This role is temp-to-perm, 40 hours/week, paid at an hourly rate plus highly subsidized benefits. The role will start out remote, but once COVID-19 restrictions are lifted, the goal is to have this person onsite in Hartford, CT.

We are looking for a savvy Hadoop Data Engineer to join our growing team of analytics experts. The contractor will be responsible for building and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder. The Hadoop Data Engineer will support the initiatives of our software developers and database architects and will ensure that an optimal data delivery architecture is applied consistently across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

Fundamental Components:

• Develops large-scale data structures and pipelines to organize, collect, and standardize data that helps generate insights and addresses reporting needs.
• Collaborates with other data teams to transform data and integrate algorithms and models into automated processes.
• Uses knowledge of Hadoop architecture and HDFS commands, along with experience designing and optimizing queries, to build data pipelines.
• Builds data marts and data models to support Data Science and other internal customers (see the sketch after this list).
• Analyzes current information technology environments to identify and assess critical capabilities and recommend solutions.
• Experiments with available tools, and advises on new tools, to determine the optimal solution given the requirements dictated by the model/use cases.
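
For a flavor of what this work looks like in practice, here is a minimal sketch of one pipeline step that lands raw HDFS data in a partitioned data mart. It assumes a Spark-on-YARN cluster with Hive support; the HDFS path, the analytics.claims_mart table, and the claim_id/claim_date columns are hypothetical placeholders, not details from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("claims-mart-build")
    .enableHiveSupport()  # allows writing managed Hive tables
    .getOrCreate()
)

# Ingest raw files landed in HDFS (path is a hypothetical placeholder).
raw = spark.read.option("header", "true").csv("hdfs:///data/raw/claims/")

# Standardize: normalize column names, parse dates, drop duplicate claims
# (claim_id and claim_date are illustrative column names).
standardized = (
    raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
       .withColumn("claim_date", F.to_date("claim_date", "yyyy-MM-dd"))
       .dropDuplicates(["claim_id"])
)

# Publish a partitioned data mart table for Data Science and reporting
# (assumes an existing "analytics" database).
(
    standardized.write.mode("overwrite")
    .partitionBy("claim_date")
    .saveAsTable("analytics.claims_mart")
)
```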

Client Description:

Our Client:

  • Makes it their mission to guide people toward better health
  • Is considered by Fast Company to be amongst the most innovative companies
  • Is listed by Forbes as one of the world’s most admired companies
  • Is altering the products it offers at retail toward more health-conscious offerings!

Our client has recently made a move toward a major merger, bringing two big industry brands together, which means this is a great time to get involved!

Working here gets you into a fast-paced environment, with tons of cross-functional working relationships. You’d have exposure to a number of teams related to any project you’re working on, and we sometimes see freelancers convert to permanent employees on teams they’ve had the opportunity to interact with. This is definitely a standout opportunity!

WHY YOU WANT TO WORK THROUGH AQUENT:

We care about your CAREER GOALS.

  • We offer resume & portfolio review + interview prep. You'll feel set for success!
  • Learn for free: https://thegymnasium.com/
  • Who is AQUENT: www.youtube.com/watch?v=5z-n8nfytuM
  • Pay it forward: https://aquent.com/rewards/

Working with AQUENT gets you access to some pretty cool things, including:

  • Subsidized health, vision and dental benefits
  • Access to Fidelity 401(k) and FSA Program
  • Direct deposit for your weekly paycheck
  • Check out our Benefits.

Requirements:

BACKGROUND/EXPERIENCE desired:

• 3 or more years of progressively complex related experience.
• Strong knowledge of large-scale search applications and building high-volume data pipelines.
• Experience building data transformation and processing solutions.
• Knowledge of Hadoop architecture and HDFS commands, and experience designing and optimizing queries against data in the HDFS environment (see the query sketch after this list).
• Ability to understand complex systems and solve challenging analytical problems.
• Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources.
• Strong collaboration and communication skills within and across teams.
• Strong problem-solving skills and critical-thinking ability.
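
As a rough sketch of query design against HDFS-backed data, the example below filters on a partition column so the engine can prune HDFS partitions instead of scanning the full table. It reuses the hypothetical analytics.claims_mart table from the earlier sketch; the provider_id and claim_date columns are likewise illustrative.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("mart-query")
    .enableHiveSupport()
    .getOrCreate()
)

# Filtering on the partition column (claim_date) lets Hive/Spark read only
# the matching HDFS partitions rather than scanning the whole table.
monthly = spark.sql("""
    SELECT provider_id, COUNT(*) AS claim_count
    FROM analytics.claims_mart
    WHERE claim_date BETWEEN '2020-05-01' AND '2020-05-31'
    GROUP BY provider_id
    ORDER BY claim_count DESC
""")

monthly.show(20)
```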

SKILL SET desired:

• Hive
• Shell Script
• Unix
• Hadoop concepts (Sqoop, YARN, MapReduce, etc.; see the sketch below)
• Python
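
Finally, here is a minimal sketch of the shell-facing side of this skill set: driving HDFS commands and a Sqoop import from Python. The JDBC connection string, table name, and HDFS paths are hypothetical placeholders, not details from this posting.

```python
import subprocess
from typing import List

def run(cmd: List[str]) -> None:
    """Run a shell command, echoing it for the job log and failing loudly."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Import a relational table into HDFS with Sqoop (the JDBC URL and table
# name are illustrative only).
run([
    "sqoop", "import",
    "--connect", "jdbc:mysql://db.example.com/warehouse",
    "--table", "members",
    "--target-dir", "/data/raw/members",
    "--delete-target-dir",  # makes the job re-runnable by clearing old output
    "--num-mappers", "4",
])

# Sanity-check what landed in HDFS.
run(["hdfs", "dfs", "-ls", "/data/raw/members"])
```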