Data Platform Engineer
BlackStone eIT is looking for a skilled Data Platform Engineer to join our dynamic team. You will be responsible for designing, building, and maintaining scalable data platforms that support the company’s data processing and analytics needs.
In this role, you will collaborate with data engineers, analysts, and other stakeholders to develop efficient data pipelines, ensure data quality, and contribute to the overall data infrastructure architecture. This is a fantastic opportunity to advance your career by working with cutting-edge technologies and helping shape BlackStone eIT’s data capabilities.
Required Skills & Technologies
- Python — async data pipelines, background jobs, scripting
- PostgreSQL — schema design, migrations (Alembic), query optimization
- Azure Data Lake Storage + Synapse Analytics
- dbt — transformation, testing, documentation
- Apache Airflow or Azure Data Factory
- Data quality frameworks (Great Expectations, dbt tests, or custom)
- Observability — structured logging, alerting, Azure Monitor or Prometheus/Grafana
- Microsoft Graph API — SharePoint, M365 data extraction
- Redis — queue management, caching
- Docker — containerized pipeline jobs
- SQL — advanced analytical queries, window functions, performance tuning
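As a rough illustration of the "advanced analytical queries, window functions" skill named above (a hypothetical example, not part of the role description; the table and data are invented), a per-partition running total can be computed with a `SUM() OVER` window, shown here via Python's built-in `sqlite3`:

```python
import sqlite3

# In-memory database with a small, invented events table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT, region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("2024-01-01", "gulf", 10),
        ("2024-01-02", "gulf", 5),
        ("2024-01-01", "eu", 7),
    ],
)

# Running total of amount within each region, ordered by day.
rows = conn.execute(
    """
    SELECT day, region, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM events
    ORDER BY region, day
    """
).fetchall()

for row in rows:
    print(row)
```

SQLite has supported window functions since version 3.25, so this runs with the standard library alone; the same pattern applies directly in PostgreSQL.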
Key Responsibilities
• Develop and maintain data pipelines for ingestion, processing, and storage of large datasets
• Implement ETL/ELT processes to transform raw data into usable formats
• Collaborate with cross-functional teams to understand data requirements and deliver solutions
• Ensure data quality, consistency, and reliability across platforms
• Optimize database performance and monitor platform health
• Assist in the design and implementation of data governance and security measures
• Document data infrastructure and processes for operational clarity
Requirements
• Bachelor’s degree in Computer Science, Information Technology, or a related field
• 3+ years of experience in data engineering or data platform development
• Proficiency in SQL and experience with relational databases like PostgreSQL or MySQL
• Familiarity with data pipeline and ETL tools such as Apache Airflow, Azure Data Factory, or similar
• Experience with cloud platforms (AWS, Azure, or Google Cloud)
• Knowledge of Python or another programming/scripting language for data processing
• Understanding of data security, governance, and quality best practices
• Strong analytical and problem-solving skills
• Good communication skills and ability to work effectively in a team environment
Benefits
• Paid Time Off
• Performance Bonus
• Training & Development