About The Position
The primary role of the Python Developer on our Data team is to build and deploy Python-based, serverless processes that leverage AWS cloud environments and data services (including machine learning) for customers, using innovative automation tools and cutting-edge technologies.
The Python Developer should enjoy optimizing data systems and building them from the ground up. The developer will support the design of new systems and the migration of existing ones, working closely with solutions architects, project managers, and data scientists.
They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
The right candidate will be excited by the prospect of building, optimizing, or re-designing our customers’ data architecture to support our next generation of products, data initiatives, and machine learning systems.
Summary of Key Responsibilities
- Build and operate the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources using SQL, cloud (mainly AWS) migration, and ‘big data’ technologies.
- Optimize the ingestion, storage, processing, and retrieval of diverse data types, from near-real-time events and IoT streams to unstructured data such as images, audio, video, and documents.
- Keep our customers’ data separated and secure to meet compliance and regulatory requirements.
- Build visualization views that provide actionable insights into customer data and its flow through the pipeline.
- Work with customers and internal stakeholders, including the Executive, Product, Data, Software Development, and Design teams, to resolve data-related technical issues and support their data infrastructure needs.
Summary of Experience
- B.Sc. in Computer Science, Mathematics, Bioinformatics, Information Systems, or another quantitative field.
- 2+ years of experience in automation development, working with Python tooling and scripting for data manipulation.
- Experience with data-driven environments/products.
- Experience with big data tools: Spark, Elasticsearch, Hadoop, Kafka, Kinesis, etc.
- Experience with relational SQL databases (e.g., MySQL, Postgres) and NoSQL databases (e.g., DynamoDB, Cassandra).
- Ability to learn a wide range of technologies and topics.
- Familiarity with Agile methodology.
- Experience with AWS cloud services: EC2, RDS, EMR, Redshift, etc.
- Experience with other programming and scripting languages, such as Java or Scala.
- Experience with and/or knowledge of machine learning workflows and Jupyter Notebook.
- Experience with AWS or other cloud platforms
- Experience with serverless platforms (AWS Lambda).
- Experience with at least one of the following: Amazon Web Services, Microsoft Azure, Google Cloud Platform
AllCloud is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, genetics, or any other basis forbidden under federal, provincial, or local law.