A major international industrial organisation is looking to hire a BI Data Engineer for its Business Intelligence unit. The role involves modernising the unit's practices and technologies, working alongside senior stakeholders to understand their key requirements and existing data sources, and presenting back options that will allow the data to tell its story.
As the organisation migrates to Power BI, the BI Data Engineer will support not only the technology transition but also the shift away from "flat file" reports towards dashboard-focused data presentations with drill-throughs, allowing business partners to monitor by exception and understand what the data is saying at a glance.
Gather business requirements for enhancements and customisations, and develop technical requirement documents.
Should be able to provide technical solutions and prepare estimates based on the specific requirements and project timelines.
Participate in business meetings to comprehend and evaluate the feasibility of the business requirements.
Responsible for functional flow gap analysis, solution design, POCs, and demos, covering gap solutions drawn from standard solution sets as well as customised solutions.
Should be capable of analysing current ETL processes, as well as defining and designing new systems.
Experience with Azure Data Factory is required.
Should be able to redefine existing business intelligence systems and make technical/strategic changes to improve them.
Demonstrate knowledge of Power BI and the Microsoft BI toolset.
Participate in post-implementation support to provide solutions for production issues or other maintenance activities that may involve the change management process.
All code changes should be tested and documented; this includes unit testing, component integration testing, system integration testing, performance testing, capacity testing, and quality reviews.
We're looking for someone with deep Power BI expertise, covering both project delivery and technical implementation.
Prior experience with Azure Data Lake Storage and its associated technologies (Azure Synapse, Azure Data Factory) is required.
Please see the detailed requirements list below.
If you believe you meet these requirements, please apply and we will contact you to discuss the role and your experience further.
Solid understanding of SQL and exceptional skill in writing DAX scripts.
Understanding of data visualisation strategies used in dashboards to tell a story.
Working knowledge of data documentation.
Should have previous experience with Azure Data Lake Storage.
Extensive experience implementing Azure Data Factory pipelines using current technologies.
Working knowledge of On-premises Data Gateway, data integration, and self-service data preparation.
Working knowledge of the Azure Synapse environment.
Strong T-SQL skills, as well as experience with CI/CD DevOps pipelines and automation for Azure SQL DW.
Good understanding of data warehousing and data modelling concepts.
Excellent knowledge of unit testing and BI object migration.
Good understanding of data quality, data cleansing, and data transformation processes.
Extensive experience manipulating and analysing large amounts of data.
Fact-driven and analytical, with exceptional attention to detail.
Excellent business sense.
Excellent problem-solving abilities, as well as critical and analytical thinking skills.
The candidate must have excellent written/verbal communication and facilitation skills.
Ref: #4362064