Job Title: Big Data Engineer
Location: Hartford, CT
Duration: 12 months / Permanent
Interview Mode: Video
JOB DESCRIPTION
Must have: GCP / BigQuery. Snowflake is a plus.
- Develops large-scale data structures, pipelines, and efficient ETL (extract/transform/load) workflows to organize, collect, and standardize data that helps generate insights and address reporting needs.
- Collaborates with other data teams to transform data and integrate algorithms and models into automated processes.
- Applies knowledge of Hadoop architecture and HDFS commands, along with experience designing and optimizing queries, to build data pipelines.
- Builds data marts and data models to support Data Science and other internal customers.
- Analyzes current information technology environments to identify and assess critical capabilities and recommend solutions.
- Experiments with available tools and advises on new tools to determine the optimal solution given the requirements dictated by the model/use cases.
- 3 or more years of progressively complex related experience.
- Strong knowledge of large-scale search applications and building high-volume data pipelines.
- Experience building data transformation and processing solutions.
- Knowledge of Hadoop architecture and HDFS commands, with experience designing and optimizing queries against data in the HDFS environment.
- Ability to understand complex systems and solve challenging analytical problems.
- Experience and proficiency with GCP cloud services, Snowflake, and BigQuery.
- Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources.
- Strong collaboration and communication skills within and across teams.
- Strong problem-solving skills and critical-thinking ability.
DESIRED SKILL SET
- Shell Script
- Hadoop Concepts (Sqoop, YARN, MapReduce, etc.)
If you are interested in applying for the above position, send your updated resume to email@example.com.