Vacancy

Platform Engineer AI

New York / Permanent

Gina Davies-Veness
Consultant
gina.davis-veness@datalogicrecruitment.com
+1 (646) 844-5743

We're working with a fantastic client that delivers actionable, data-driven results to its clients. Its Analytics Innovation team is an analytics technology group focused on developing a suite of advanced analytics products and on supporting business strategy consulting teams to meet the evolving needs of their clients.

Duties and Responsibilities

To meet the demands of growth, the firm is looking for a Data Engineer/Analyst who can play several important roles:

1. Collaborate with internal partners to understand business and insight goals, formulate hypotheses, and identify relevant KPIs and diagnostics to pursue.

2. Plan and build complex automated analytics solutions that deliver rapid, high-value results using Apache Spark and/or other big data platforms.

3. Investigate, load and transform data sources for use by consulting teams. Manage scheduled data pipelines for frequently updated sources.

4. Identify, evaluate, test, and solve data quality issues and document outcomes.

5. Work with consulting teams and business stakeholders to implement data sources and tools developed by the Analytics Innovation team.

 

Qualifications

The Data Engineer/Analyst role is a mid-level position for applicants with a passion for working with large data sets and collaborating with diverse teams to solve an ever-changing set of problems. We seek specialists with strong problem-solving skills and a track record of achieving results, as well as a desire for the personal impact that can only be found within a boutique organization. Candidates should have the following qualifications:

 

Required Skills

· 1-4 years’ experience analyzing and transforming data using SQL or Python.

· 1-4 years’ experience automating repetitive workflows using Python, SQL, Bash, JavaScript or similar.

· Experience working across multiple platforms and distributed systems such as AWS S3/EC2/Redshift, Azure, Google Cloud Platform, Databricks, Snowflake, Qubole.

· Knowledge of data ETL and scheduling tools (Apache Airflow, AWS Glue, DBT, Alteryx).

· Experience working with end users to conduct needs assessments and user testing.

 

Desired Skills

· Experience working with and running ETLs on traditional relational databases – MySQL, PostgreSQL, MSSQL, Oracle SQL.

· Experience publishing to BI solutions such as Tableau, Qlik, Looker, or Power BI.

· Knowledge of geospatial data management and analysis.

 

Experience/Education

· Bachelor’s Degree (Computer Science, Information Systems, or related IT/Engineering field preferred) or equivalent work experience.