Key Skills:
- Azure Data Factory (ADF): Proficiency in building ADF data pipelines, particularly for ETL/ELT processes moving data from D365 F&O and other systems into the data lake.
- Linked Services: Connecting ADF with D365 F&O, other databases, and external sources.
- Data Flows: Implementing complex data transformations using ADF’s data flow capabilities.
- Real-Time Data Integration: Experience with real-time data ingestion and processing using tools like Azure Stream Analytics, especially for pushing data into the data lake in near-real-time.
- Databricks or Synapse Analytics: Knowledge of using Azure Databricks or Synapse for data processing, transformation, and analytics within the data lake environment. This includes working with large-scale distributed data processing.
- Data Storage Optimization: Skills in optimizing data storage within the data lake, including choosing the right storage tiers (e.g., hot, cool, archive) and compressing large datasets.
- Data Lake & Delta Lake File Formats: Familiarity with optimized file formats such as Parquet, Avro, or Delta Lake for efficient querying and data storage.
- Refresh Orchestration: Using the Power BI service for scheduled dataset refreshes and Azure Data Factory for orchestrating the upstream data refreshes they depend on.
- Monitoring and Performance Tuning: Experience with monitoring data pipelines, troubleshooting performance bottlenecks, and optimizing for cost-effective usage of Azure resources.
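The ADF pipeline skill above typically centers on the Copy activity pattern: a pipeline that reads from a D365 F&O source dataset and lands the data in the lake. A minimal sketch of such a pipeline definition follows; all names (`PL_D365FO_To_Lake`, the two dataset references) and the source/sink type strings are illustrative placeholders, not taken from any real environment:

```json
{
  "name": "PL_D365FO_To_Lake",
  "properties": {
    "activities": [
      {
        "name": "CopyCustomersToRaw",
        "type": "Copy",
        "inputs": [
          { "referenceName": "DS_D365FO_Customers", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "DS_ADLS_Raw_Customers", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "ODataSource" },
          "sink": { "type": "ParquetSink" }
        }
      }
    ]
  }
}
```

The inputs and outputs here would be backed by linked services (one for the D365 F&O endpoint, one for the data lake storage account), which is where the "Linked Services" skill above comes in.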
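A core reason Delta Lake appears in the file-format bullet above is its MERGE (upsert) capability on top of Parquet. The following is a plain-Python sketch of the *semantics* only, with hypothetical rows; in practice this would be a Delta Lake `MERGE INTO` running on Databricks or Synapse, not hand-written Python:

```python
def merge_upsert(target, updates, key="id"):
    """Key-based upsert: update rows whose key matches, insert the rest.

    This mimics, in plain Python, what a Delta Lake MERGE statement
    does atomically over Parquet-backed tables.
    """
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = row  # matched -> update, not matched -> insert
    return sorted(merged.values(), key=lambda r: r[key])


# Hypothetical sample rows for illustration.
target = [
    {"id": 1, "status": "open"},
    {"id": 2, "status": "open"},
]
updates = [
    {"id": 2, "status": "closed"},  # matched -> updated in place
    {"id": 3, "status": "open"},    # not matched -> inserted
]
print(merge_upsert(target, updates))
```

The same key-matching logic is what incremental loads from D365 F&O into the lake rely on, so it is worth understanding independently of any one engine.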
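On storage optimization: columnar formats and compression pay off because raw extracts from transactional systems are highly repetitive. A small stdlib-only sketch, using gzip on a synthetic CSV-like extract to make the redundancy visible (the order-row layout is invented for illustration):

```python
import gzip

# Hypothetical sample: a repetitive CSV extract, as raw exports from
# transactional systems often are. Columnar formats like Parquet
# exploit exactly this kind of redundancy, on top of compression.
rows = "\n".join(
    f"ORD-{i:06d},2024-01-15,COMPLETED,USD" for i in range(10_000)
)
raw = rows.encode("utf-8")

compressed = gzip.compress(raw)

ratio = len(compressed) / len(raw)
print(f"raw: {len(raw):,} bytes, gzip: {len(compressed):,} bytes "
      f"(ratio {ratio:.2%})")
```

The same reasoning drives the tiering decision in the bullet above: rarely-read history compresses well and can sit in cool or archive tiers, while hot-path data stays in the hot tier.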
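For the monitoring bullet, a common first step is flagging pipeline runs whose duration is far above the norm. A minimal sketch, assuming run records (name, duration in seconds) have already been pulled from ADF's monitoring views; the pipeline names and durations here are hypothetical:

```python
from statistics import mean

def flag_slow_runs(runs, factor=2.0):
    """Return pipeline names whose duration exceeds `factor` times
    the mean duration across all runs in the sample."""
    avg = mean(duration for _, duration in runs)
    return [name for name, duration in runs if duration > factor * avg]


# Hypothetical run durations for illustration.
runs = [
    ("copy_customers", 120),
    ("copy_orders", 140),
    ("copy_ledger", 900),   # bottleneck candidate
    ("copy_vendors", 110),
]
print(flag_slow_runs(runs))
```

In practice the threshold would be tuned per pipeline, but even this crude mean-based cut surfaces the runs worth investigating for performance tuning or cost savings.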