
Senior Data Architect

Company: AMANST Inc.
Address: Ontario, Canada
Employment type: Contractor
Salary:
Expires: 2023-07-31
Posted: 10 months ago
Job Description

AMANST Inc. is looking for a Senior Data Architect/Modeller for a contract opportunity with the Ontario Government.


Requisition# RQ05661

Estimated Business Days: 220


Submission deadline: 7th July 2023 at 12:00 pm

Location: Remote (Ontario only)

Must Haves

  • Experience gathering user requirements to develop a solution for data scientists
  • 5+ years of in-depth technical knowledge, expertise, and experience with Azure Databricks, including but not limited to workspaces, networking, clusters, notebooks, DBFS, etc. (a minimal notebook sketch follows this list)
  • Basic knowledge of and experience with Azure DevOps
  • Demonstrated skills in Python and RStudio, with financial, business intelligence, and data visualization development experience
  • Demonstrated success in team-focused delivery: meeting deadlines, managing competing priorities, and managing client relationships
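
For illustration only (not a requirement from the posting), here is a minimal sketch of the kind of Databricks notebook work the Azure Databricks must-have describes. It assumes the code runs in a Databricks notebook, where `spark` and `dbutils` are provided by the runtime; the DBFS path and table name are hypothetical placeholders.

```python
# Minimal sketch: list staged files in DBFS, load a CSV with PySpark,
# and persist it as a Delta table. `spark` and `dbutils` come from the
# Databricks notebook runtime; paths and names are hypothetical.

# Inspect files staged in DBFS (Databricks File System).
for f in dbutils.fs.ls("dbfs:/mnt/project-data/raw/"):
    print(f.path, f.size)

# Read a staged CSV into a Spark DataFrame.
df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("dbfs:/mnt/project-data/raw/field_plots.csv")
)

# Register the data as a managed Delta table for downstream work.
df.write.format("delta").mode("overwrite").saveAsTable("project.field_plots")
```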

Nice to have

  • Data science work experience gained with a forest-industry company, or a completed graduate degree in an academic/university environment, is desirable but not mandatory
  • Skills to analyze field data and incorporate it into various predictive models, and to develop the associated statistical reporting and confusion matrices (see the sketch after this list)
  • Public sector experience
  • Skills, knowledge, and experience with RStudio, GitHub, and other open-source tools, as well as knowledge of Python geospatial libraries, are desirable but not mandatory
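
As an illustrative sketch of the "predictive models and confusion matrices" item above, the snippet below uses scikit-learn on a hypothetical field-data file; the column names and model choice are placeholders, not part of the posting.

```python
# Sketch: fit a simple classifier on hypothetical field-plot data and
# report a confusion matrix. File name, columns, and model are placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix, classification_report
from sklearn.model_selection import train_test_split

field = pd.read_csv("field_plots.csv")            # hypothetical field dataset
X = field[["height_m", "crown_area", "ndvi"]]     # example predictor columns
y = field["species_group"]                        # example target label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print(confusion_matrix(y_test, pred))
print(classification_report(y_test, pred))
```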

Responsibilities:

  • Support product backlog grooming, task identification and effort estimation
  • Attend scrum calls, provide timely updates, and raise any blockers
  • Work with the project business team to analyze the business's Databricks programming requirements and design scalable, efficient processing pipelines within the Azure storage environment, optimized for the ingestion of large field, optical, and lidar datasets
  • Work with the business to operationalize workflows and data models from RStudio, LAStools, Python, Tableau, artificial intelligence, machine learning, and Internet of Things sources (IoT, Microsoft Planetary, Google Earth Engine, etc.) into Databricks notebooks using Azure data pipelines (a minimal pipeline sketch follows this list)
  • Develop code/scripts to automate the business workflow, and test and implement solution components using Databricks, standard Azure services, and .NET for scripting and integration work
  • Utilize modernized BI (Business Intelligence) to provide the project team with reports/analytics on resource usage, cost allocation, performance metrics, and meaningful insights
  • Provide training and mentoring to project team members, with good communication skills and the ability to effectively transfer knowledge
  • Consult the tech lead on all IT solution designs and options before presenting them to the rest of the team, to ensure compliance with OPS policies and standards
  • Work with the project IT (Information Technology) team to refine data engineering, processing, modeling, and machine learning tasks within the Databricks data science platform. Tasks include, but are not limited to, designing data pipelines, optimizing data storage and processing, and integrating Databricks with other components of the system, such as Azure Functions, Azure Storage, Azure Data Factory, SQL DB, and Cosmos DB
  • Provide knowledge transfer/training to the business and technical teams
  • Comply with and follow all OPS IT policies, standards, and processes, e.g. AODA (Accessibility for Ontarians with Disabilities Act) and GO IT Standards
  • Apply expertise in the design, delivery, evaluation, and maintenance of leading-edge user experience strategies and online experiences for the business unit's client (forest industry), stakeholders (internal ministries), and the public (including researchers, federal ministries, and other ministry users), using the government's official websites
  • Provide timely updates to the project manager and project team
  • Work with the IT team to monitor the performance and health of Azure Databricks resources, resolve issues proactively, and ensure efficient scaling based on workload demands
  • Pick up an OPS (Ontario Public Service) laptop from a downtown office in Toronto and return it to the same location upon completion of the contract
  • Work with the project team (IT and business) to assess resource requirements, estimate costs, and supply accurate financial projections
  • Lead and conduct multiple concurrent research, content design, and strategy projects to meet the needs of all OPS (Ontario Public Service) clients involved in forest management planning and resource management activities
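
The sketch below is a rough, non-authoritative illustration of the pipeline-oriented responsibilities above: ingesting a lidar-derived extract from Azure storage and persisting it as a Delta table from a Databricks notebook. The storage account, secret scope, paths, and table names are hypothetical, and in practice an Azure Data Factory pipeline would schedule or trigger the notebook.

```python
# Minimal PySpark sketch: read a lidar-derived metrics extract from ADLS Gen2,
# apply light cleaning, and append it to a Delta table. Runs in a Databricks
# notebook (`spark` and `dbutils` provided by the runtime); all account,
# scope, path, and table names are hypothetical.
from pyspark.sql import functions as F

# Access the storage account via an account key kept in a secret scope.
spark.conf.set(
    "fs.azure.account.key.forestdata.dfs.core.windows.net",
    dbutils.secrets.get(scope="project-scope", key="storage-account-key"),
)

raw = (
    spark.read
    .format("parquet")
    .load("abfss://lidar@forestdata.dfs.core.windows.net/metrics/2023/")
)

# Basic cleaning and derivation before handing the data to the modelling team.
curated = (
    raw.dropDuplicates(["plot_id", "acquired_at"])
       .withColumn("canopy_cover_pct", F.round(F.col("canopy_cover") * 100, 1))
       .filter(F.col("height_p95") > 0)
)

curated.write.format("delta").mode("append").saveAsTable("curated.lidar_metrics")
```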

Evaluation Criteria

Technical skills 70%

  • Solid understanding of data science principles and machine learning algorithms for application in geospatial and resource management scenarios; able to assist with tasks such as data exploration, feature engineering, model development, and model deployment on the Databricks platform
  • End-to-end responsibility for design, documentation, development, testing, and deployment with Azure Databricks
  • Proficient in optimizing Databricks workloads for performance and scalability; capable of tuning Spark configurations, optimizing data processing operations, and troubleshooting performance issues (see the configuration sketch after this list)
  • Experience analyzing business requirements, designing scalable and efficient Databricks architecture, and coding Python and R scripts as a data scientist
  • Skilled in data engineering techniques, including data ingestion, transformation, API (Application Programming Interface) integration, data streaming, and open-source code and tool integration. Experience working with various data sources and formats, and able to implement data pipelines using Azure Data Factory and Databricks in the context of remote sensing and resource management techniques
  • Expert-level knowledge of cost control strategy, usage pattern analysis, and cost-saving measures
  • Experience working with Azure Databricks to build data models and implement machine learning tasks as a Databricks expert
  • Experience with or knowledge of remote sensing, sustainable resource management, Ontario's forest and resource industry, and the Forest Resources Inventory (FRI) program would be beneficial
  • Strong problem-solving skills to diagnose and resolve issues during the implementation and maintenance of Databricks solutions; capable of identifying bottlenecks, troubleshooting errors, and proposing workflow optimization solutions
  • Hands-on experience generating comprehensive reports and analytics on resource usage, cost allocation, and performance metrics on Databricks and Azure cloud services
  • Working experience in designing data pipelines; optimizing data storage and processing; Spark (parallel processing), Delta tables and Delta Lake, Cosmos DB, Docker, and object-based and/or pixel-based image analysis; and integrating Databricks with other components of the system, such as Azure Data Factory and Azure SQL DB
  • Skills and experience working with modern geospatial raster and point-cloud GPS (Global Positioning System) data products, processing, and analysis
  • Knowledgeable about best practices for data governance, security, and compliance on the Databricks platform; should provide guidance on data privacy, access control, and auditing. Knowledge of user interface design principles and best practices
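
As an illustrative sketch of the Spark-tuning work named above (not an official configuration), a few common Databricks/Spark settings for a shuffle-heavy workload are shown below; the values and table names are placeholders that depend on cluster size and data volumes.

```python
# Sketch: common Spark tuning knobs for a wide-join, partition-heavy workload.
# Values are illustrative only; the referenced tables are hypothetical.

# Adaptive query execution rebalances skewed joins and coalesces small
# partitions at runtime.
spark.conf.set("spark.sql.adaptive.enabled", "true")
spark.conf.set("spark.sql.adaptive.coalescePartitions.enabled", "true")

# Match shuffle parallelism to the data volume rather than the default 200.
spark.conf.set("spark.sql.shuffle.partitions", "400")

# Broadcast small dimension tables (threshold in bytes) to avoid shuffles.
spark.conf.set("spark.sql.autoBroadcastJoinThreshold", str(64 * 1024 * 1024))

# Inspect the physical plan to confirm the join strategy before a full run.
plots = spark.table("curated.lidar_metrics")
lookup = spark.table("reference.species_lookup")
plots.join(lookup, "species_code").explain(mode="formatted")
```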

Soft skills 30%

  • Ability to integrate with a skilled business unit and the LRC technical team
  • Excellent communication skills, both written and verbal
  • Excellent analytical, problem-solving, and decision-making skills
  • Excellent meeting facilitation skills to gather requirements
  • Experience reporting progress on deliverables to team, project leads and management, including proactively raising risks/issues with mitigations
  • Strong stakeholder management skills
  • Experience with agile methodology