Python Developer

at UNICON International, Inc.
Published March 24, 2020
Location Scottsdale, AZ
Category Default  
Job Type Full-time  

Description

We are currently accepting resumes for a Python Developer in Scottsdale, AZ.

Job Summary:

This role will work within AWS infrastructure and will be responsible for developing, debugging, and deploying data assets using various AWS service APIs, the AWS CLI, and CI/CD pipelines. The developer will migrate data feeds from external sources into the NW AWS data lake, following our standards and security procedures, and will design and develop data schemas to meet business requirements. The role requires the ability to perform data migrations and data manipulations across various storage systems, to output results in several formats (JSON, data feeds, reports, etc.), and to manipulate, extract, and load data from several sources into another schema.

Required Skills and Experience:

- Bachelor’s degree in computer science, management information science, or related field

Aptitude:

- Understanding of core AWS services and basic AWS architecture best practices

- Proficiency in developing, deploying, and debugging cloud-based data assets

- Proficiency in SQL, familiarity with various databases (Aurora, DynamoDB), and the ability to design schemas to meet requirements

- Expertise in data analysis methodologies and processes and their linkages to other processes in the software development lifecycle

- Experience with integration efforts (packaged and customized applications) from a data analysis perspective

Attitude:

Good communication skills, both oral and written. Ability to multitask and work in a self-sufficient manner. Works well with the team. Critical thinking skills and attention to detail. Good judgment, initiative, commitment, and resourcefulness.

Skills:

- 5+ years of hands-on experience with Python (Consultant level), PySpark SQL, EC2, S3, and Lambda
- 2+ years of hands-on experience with Python (Specialist level), PySpark SQL, EC2, S3, and Lambda
- 5+ years of experience developing data ingestion, data processing, and analytical pipelines for big data, relational database, NoSQL, and data warehouse solutions
- Extensive experience providing practical direction within AWS native services
- 4+ years of extensive hands-on experience implementing data migration and data processing using AWS services
- 5+ years with Auto Scaling, NiFi, CDC processing, Redshift, Snowflake, RDS, Aurora, DynamoDB, NoSQL, CloudTrail, CloudWatch, and Glue
- 5+ years of RDBMS experience, including experience using file formats and compression techniques
- 3+ years of experience working with DevOps tools such as GitLab, Jenkins, CodeBuild, CodePipeline, CodeDeploy, etc.

Preferred Skills and Experience:

- Streaming technologies such as Kafka, Kinesis, etc.

UNICON International, Inc. is an Equal Opportunity Employer.

If you are interested in working for an organization where honesty, integrity, and quality are among the core principles, then click apply today!

Keywords: Python, PySpark SQL, EC2, S3, Lambda, AWS, Auto Scaling, NiFi, CDC processing, Redshift, Snowflake, RDS, Aurora, DynamoDB, NoSQL, CloudTrail, CloudWatch, Glue, GitLab, Jenkins, CodeBuild, CodePipeline, CodeDeploy