ETL/Python Developer

Published June 10, 2021
Location Charlotte, NC
Category Default  
Job Type Full-time  

Description

Role: ETL/Python Developer

Location: Charlotte, NC

The role is for an Architect / Data Engineer. As an Architect, you will help the team craft data solutions that meet business and enterprise requirements. While our core stack is currently Informatica / Oracle / SQL, we are exploring newer big data approaches to moving data, such as PySpark and Python-based solutions. This person is expected to bring new technology, knowledge, and experience to the team and to mentor other team members. The preferred candidate will have prior ETL experience with Informatica/Oracle and recent experience with newer methods of performing ETL, such as Spark and Python DataFrames.

Required Skills:

  • 7+ years of development experience in Oracle, SQL Server, Netezza, or another industry-accepted database platform.
  • 7+ years in Data Warehouse / Data Mart / Business Intelligence delivery.
  • 3+ years with an industry ETL tool (preferably Informatica PowerCenter).
  • 3+ years of Linux / shell scripting (e.g., Bash, Perl, Python).
  • 3+ years of Python (e.g., Pandas, DataFrames) and its use in data processing solutions.
  • Experience with an enterprise job scheduling tool (e.g., Autosys, Airflow).
  • Proven experience designing and building integrations that support standard data warehousing data models: star schema, snowflake, and the various normal forms.
  • Strong analytical and problem-solving skills. Passion for working with data.
  • Experience with two or more of the following:
      • Developing data pipeline solutions with Python- or Spark-based methodologies rather than industry-standard ETL tools.
      • Modern job orchestration tools for data pipelines, such as Airflow.
      • Integrating rules engines (e.g., Sapiens) into data pipeline workflows.
      • Big data and/or emerging data technology tools and methodologies.
      • Kafka, Sqoop, Spark, NiFi.
  • Bachelor’s degree in a STEM-related field.

Desired Skills:

  • 10+ years of data engineering experience.
  • Banking / Capital Markets / Accounting domain knowledge.
  • Experience automating QA tests as part of the development workflow.
  • Experience creating low-level and high-level design artifacts.
  • Ability to present technical concepts to senior-level business stakeholders.
  • Excellent communication skills, verbal and written.
  • Self-motivated.
  • Excellent interpersonal skills, positive attitude, team player.
  • Willingness to learn and adapt to changes.
  • Experience working in a global technology development model.
  • Ability to manage multiple deadline-driven, customer-sensitive projects and tasks effectively.
  • Knowledge of agile methodologies and frameworks such as Scrum and Kanban.
  • Experience working in a SAFe Agile delivery model.
  • Advanced degree.