BI Data Engineer
Company | Integriti
Location | Toronto, Ontario, Canada
Employment type | Contractor
Salary |
Category | IT Services and IT Consulting
Expires | 2023-09-07
Posted | 9 months ago
We are looking for a BI Data Engineer with Azure data engineering experience to join our team. As a key member of the Enterprise Data and Advanced Analytics team, this role enhances our data and analytics capabilities by championing industry best practices and delivering high-quality reports and training that improve business decision making. The successful candidate will translate complex solutions into digestible, business-centric outcomes by building interactive data visualizations in Power BI on top of sophisticated data models.
They will work with business-unit leaders to identify and prioritize problems that analytics is well suited to solve, and will help identify the internal or external data needed to produce the most useful insights. Beyond delivering end-to-end solutions that business users can execute on, they will also drive adoption among those users. The individual will support and collaborate with data engineers, data scientists, and business users, engaging in technical support, research, and knowledge sharing throughout the implementation lifecycle of projects and solutions. The ideal candidate is one who aims to grow into a data scientist role in the near future by taking advantage of the learning and collaboration opportunities this role provides.
Key Responsibilities:
- The candidate should be well versed in the concepts and techniques of the Kimball Methodology, which emphasizes building data warehouses optimized for business intelligence and reporting; proficiency in dimensional modeling, including the creation of star schemas and snowflake schemas, is essential
- The candidate should have experience creating dataflows using Azure Data Factory; familiarity with the Medallion Architecture is highly desirable
- Managing change and communicating impacts to stakeholders and fellow team members
- Defining and implementing technical test plans and performing unit and integration testing
- Designing and building interactive insightful data products using Power BI, ML and Azure
- Operating in an Agile development environment
- The candidate should have a strong command of PySpark, the Python API for Apache Spark's distributed data processing and analysis
- Knowledge of Postman: the candidate should be able to effectively design, test, and debug APIs, ensuring their functionality and reliability within our software ecosystem
- Designing and developing ETL pipelines to achieve a high level of reliability, scalability, and security. Sourcing, transforming, and delivering structured and unstructured assets for use in Azure Synapse Analytics, Azure Data Factory, SQL Server, Azure Data Lake
- The ideal candidate will possess strong communication and leadership skills to guide and support the team in adopting and implementing industry-leading data engineering practices
- The ideal candidate will have the ability to document technical specifications, workflows, and guidelines, ensuring clear and concise communication across the team and stakeholders
- Identifying, defining, and implementing opportunities for enhancing existing processes
- Developing and monitoring Azure data pipelines, Power BI Premium Dataflows to gather, transform and democratize structured and unstructured data assets
- Leading multiple projects simultaneously and ensuring on-time delivery
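As a small illustration of the dimensional modeling mentioned above, a Kimball-style star schema can be sketched with Python's built-in sqlite3 module. This is only an illustrative sketch; all table and column names here are hypothetical, not taken from the posting or any specific system.

```python
# Minimal star schema: one fact table joined to dimension tables.
# Hypothetical names; sqlite3 (stdlib) stands in for a real warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes.
cur.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER)")
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT)")

# The fact table holds measures plus foreign keys to each dimension.
cur.execute("""CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL)""")

cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20230101, 2023, 1), (20230201, 2023, 2)])
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(20230101, 1, 100.0), (20230101, 2, 50.0), (20230201, 1, 75.0)])

# A typical BI query: aggregate a measure sliced by a dimension attribute.
rows = cur.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.month
    ORDER BY d.month
""").fetchall()
print(rows)  # [(1, 150.0), (2, 75.0)]
```

The same fact/dimension split is what Power BI models and Kimball-style warehouses are built around: measures live in narrow fact tables, and descriptive context lives in dimensions that reports slice by.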
Qualifications:
- Experience. 5+ years solving sophisticated technical data projects using the Microsoft Azure data stack is required
- Breadth of technical experience and knowledge across the Azure Data Platform, with deep domain expertise in two or more of the following Data Platform resources, is required:
  - SQL Server Data Tools
  - Azure Synapse & Azure Data Factory
  - Azure Databricks using PySpark
  - Azure Machine Learning
- Industry Knowledge. The ability to quickly propose Azure Data Platform solutions by recalling the latest standard practices from MVP and product-team articles, Microsoft documentation, whitepapers, and community publications is required
- Relationship Building. A proven track record of building deep technical relationships and setting expectations across various partners
- Communication. Excellent written, verbal, and presentation skills targeted at all levels of the organization
- Teamwork. Motivated and keen to work in a collaborative environment, with a focus on team success over individual success
- Problem-Solving. Must be able to trace data lineage and workflows and resolve technical issues with minimal supervision. Demonstrated proficiency in understanding and implementing business workflows is a plus