Software Engineer - Data
Company: Egen
Address: Canada
Employment type: Full-time
Expires: 2023-08-14
Posted: 9 months ago
You
You are an innovative technology enthusiast who enjoys building software products and quickly seeing them work in the real world. You like to build genuinely collaborative teams and guide passionate, cross-functional technologists as they solve new problems. Above all, you drive results, holding yourself and your teammates to high standards of software quality and professionalism while keeping up with new web application tools and technologies. You can agree to disagree with a smile and still deliver results.
Us
We drive valuable Digital Experiences for established enterprises, emerging startups, and other companies through our Data Engineering, Analytics, and Application Development services. Our customized enterprise-grade solutions enable our partners to achieve greater operational efficiency and deliver better business outcomes.
Egen's Data Engineering team builds scalable data pipelines using Python, Java, or Scala and AWS. The pipelines we build typically integrate with technologies such as Kafka, Storm, and Elasticsearch. We are working on a continuous deployment pipeline that leverages rapid on-demand releases. Our developers work in an agile process to efficiently deliver high value applications and product packages.
Your Day
As a Sr. Data Platform Engineer at Egen, you will architect and implement cloud-native data pipelines and infrastructure to enable analytics and machine learning on Egen's rich datasets.
- You have implemented analytics applications using multiple database technologies, such as relational, multidimensional (OLAP), key-value, document, or graph.
- You've worked in agile environments and are comfortable iterating quickly.
- You value the importance of defining data contracts, and have experience writing specifications including REST APIs.
- A minimum of 3-5 years of experience in a production-level Data Engineering role building pipelines with Python.
- You write code to transform data between data models and formats, ideally in Python, with PySpark experience a bonus.
- You know what it takes to build and run resilient data pipelines in production and have experience implementing ETL/ELT to load a multi-terabyte enterprise data warehouse.
- Expert knowledge of relational data modeling concepts, strong SQL skills, proficiency in query performance tuning, and a desire to share knowledge with others.
- Experience moving trained machine learning models into production data pipelines.
- Experience building cloud-native applications and supporting technologies / patterns / practices including: AWS, Docker, CI/CD, DevOps, and microservices.
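The "transform data between data models" requirement above can be made concrete with a small, hypothetical Python sketch: flattening nested document-model records into flat relational-model rows, a common ETL step when loading a data warehouse from a document store. The schema and field names are invented for illustration.

```python
from typing import Iterable


def flatten_orders(docs: Iterable[dict]) -> list[dict]:
    """Flatten document-model orders (one document holding many line items)
    into relational-model rows (one row per line item)."""
    rows = []
    for doc in docs:
        for item in doc.get("items", []):
            rows.append({
                "order_id": doc["order_id"],
                "customer": doc["customer"],
                "sku": item["sku"],
                "qty": item["qty"],
            })
    return rows


docs = [{
    "order_id": 7,
    "customer": "acme",
    "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}],
}]
rows = flatten_orders(docs)  # one order document becomes one row per line item
```

The same reshaping logic scales up almost unchanged in PySpark, where `explode` on the `items` array column plays the role of the inner loop.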