Email Attachment to Azure Data Warehouse Integration (P&L Automation)


Hourly: $85.00 - $85.00

We’re looking for a skilled developer to build an automated data pipeline that pulls weekly Profit & Loss (P&L) statement attachments from a designated email inbox and processes them into our Azure data environment.

Key Responsibilities:
• Monitor a specific email inbox and extract attached P&L files (Excel or CSV format)
• Land the files in Azure Blob Storage or directly in a staging database
• Normalize and transform the data into a clean, consistent structure (e.g., pivot/unpivot rows, align columns)
• Ensure the dataset is report-ready for downstream consumption
• Load the transformed data into our Azure SQL Data Warehouse
• Push the cleaned data into an existing Azure Analysis Services tabular model for use in Power BI

Requirements:
• Strong experience with Azure tools (Blob Storage, Data Factory, Logic Apps, or Functions)
• Data wrangling and transformation expertise (Python, T-SQL, or ADF mapping data flows)
• Familiarity with data modeling and tabular model structures
• Ability to build fully automated, repeatable workflows

The solution should run on a weekly schedule and require zero manual intervention once deployed.
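As a minimal sketch of the first step (extracting P&L attachments from a message), assuming the inbox is reached via IMAP or an Azure Function mail trigger; the function and sample message below are hypothetical, and the message is built in memory here so the parsing logic is self-contained:

```python
import email
from email.message import EmailMessage

def extract_pnl_attachments(msg: EmailMessage) -> dict:
    """Return {filename: payload bytes} for Excel/CSV attachments on a message."""
    wanted = (".xlsx", ".xls", ".csv")
    out = {}
    for part in msg.iter_attachments():
        name = part.get_filename() or ""
        if name.lower().endswith(wanted):
            out[name] = part.get_payload(decode=True)
    return out

# Hypothetical sample message, standing in for one fetched from the real inbox.
msg = EmailMessage()
msg["Subject"] = "Weekly P&L"
msg.set_content("See attached.")
msg.add_attachment(b"Account,Amount\nRevenue,100000\n",
                   maintype="application", subtype="octet-stream",
                   filename="pnl_week_01.csv")

files = extract_pnl_attachments(msg)
```

In a production Azure setup this filtering would typically live in a Logic App or Function; the extension allow-list keeps stray attachments (signatures, images) out of the staging area.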

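For the normalization step, one common shape of this work is unpivoting a wide P&L sheet (one column per month) into long rows that a warehouse fact table expects. A pandas sketch, with hypothetical column names (`Account`, month headers) since the real file layout isn't specified:

```python
import pandas as pd

# Hypothetical wide-format P&L extract: one row per account,
# one column per month. Real source files will vary.
wide = pd.DataFrame({
    "Account": ["Revenue", "COGS", "Net Income"],
    "2024-01": [100000, -40000, 60000],
    "2024-02": [110000, -45000, 65000],
})

# Unpivot the month columns into (Account, Period, Amount) rows.
long = wide.melt(id_vars="Account", var_name="Period", value_name="Amount")

# Normalize the period strings into proper dates for the warehouse.
long["Period"] = pd.to_datetime(long["Period"]).dt.date
```

The same melt/unpivot operation is also available as an Unpivot transformation in ADF mapping data flows, so either tool from the requirements list could own this step.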
Keyword: Data Cleaning

Price: $85.00

SQL, Microsoft SQL Server, Microsoft SQL Server Programming, Python
