- Hartford, Connecticut
- Job Terms:
- Competitive w/ full benefits
- Posted By:
- PJ Farnham
Our Fortune client is looking for a talented Data Engineer (Hadoop). This is one of our top clients, and we have been successful in building out entire teams for this organization. The role is a 6-month temporary assignment with potential for extension, 40 hours/week, paid at an hourly rate plus very highly subsidized benefits. The role will start remote; once COVID restrictions are lifted, the goal is to have this person onsite in Hartford, CT.
We are looking for a savvy Data Engineer (Hadoop) to join our growing team of analytics experts. The contractor will be responsible for building and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder. The Hadoop Data Engineer will support our software developers' and database architects' initiatives and will ensure optimal data delivery architecture is consistent across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
• Develops large scale data structures and pipelines to organize, collect and standardize data that helps generate insights and addresses reporting needs.
• Collaborates with other data teams to transform data and integrate algorithms and models into automated processes.
• Uses knowledge in Hadoop architecture, HDFS commands and experience designing & optimizing queries to build data pipelines.
• Builds data marts and data models to support Data Science and other internal customers.
• Analyzes current information technology environments to identify and assess critical capabilities and recommend solutions.
• Experiments with available tools and advises on new tools to determine the optimal solution given the requirements dictated by the model/use cases.
Our client:
- Makes it their mission to guide people toward better health
- Is considered by Fast Company to be among the most innovative companies
- Is listed by Forbes as one of the world’s most admired companies
- This includes altering the products they offer at retail to more health-conscious offerings!
Our client has recently moved toward a major merger, bringing two big industry brands together, which makes this a great time to get involved!
Working here gets you into a fast-paced environment with tons of cross-functional working relationships. You'd have exposure to a number of teams related to any project you're working on, and we sometimes see freelancers convert to permanent employees on teams they've had the opportunity to work with. This is definitely a standout opportunity!
WHY YOU WANT TO WORK THROUGH AQUENT:
We care about your CAREER GOALS.
- We offer resume & portfolio review + interview prep. You'll feel set for success!
- Learn for free: https://thegymnasium.com/
- Who is AQUENT: www.youtube.com/watch?v=5z-n8nfytuM
- Pay it forward: https://aquent.com/rewards/
Working with AQUENT gets you access to some pretty cool things, including:
- Subsidized health, vision and dental benefits
- Access to Fidelity 401(k) and FSA Program
- Direct deposit for your weekly paycheck
- Check out our benefits.
• 3 or more years of progressively complex related experience.
• Strong knowledge of large-scale search applications and building high-volume data pipelines.
• Experience building data transformation and processing solutions.
• Knowledge of Hadoop architecture and HDFS commands, and experience designing and optimizing queries against data in the HDFS environment.
• Ability to understand complex systems and solve challenging analytical problems.
• Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources.
• Strong collaboration and communication skills within and across teams.
• Strong problem solving skills and critical thinking ability.
DESIRED SKILL SET:
• Shell Script
• Hadoop Concepts (Sqoop, YARN, MapReduce, etc.)
• Kafka and NiFi
• Experience with Azure and GCP (cloud)