Data Engineer - Oil and Gas
About us
We are 10Pearls, an award-winning digital development company, helping businesses with product design, development, and technology acceleration. We have a culture of innovation, uniquely designed to help businesses transform, digitalize, and scale by leveraging digital technology.
Role Overview:
As a Data Engineer in the oil and gas sector, you will design, build, and maintain robust data pipelines and infrastructure to support advanced analytics and machine learning initiatives. Your role ensures data accessibility, scalability, and quality, enabling seamless integration across diverse systems and sources within the organization.
Location: Abu Dhabi – Relocation for one year required (support provided)
Technical Responsibilities:
1. Data Infrastructure Development:
• Design, build, and manage scalable data pipelines to process large volumes of structured and unstructured data.
• Implement data integration solutions for sources such as SCADA systems, IoT devices, seismic data, and production logs.
• Develop ETL (Extract, Transform, Load) workflows to automate data ingestion and transformation.
2. Data Storage and Management:
• Set up and maintain data lakes, warehouses, and databases (e.g., Snowflake, AWS Redshift, Azure Data Lake, Hadoop).
• Optimize storage solutions for fast and efficient access to critical data.
• Ensure compliance with industry data governance standards, including security and privacy protocols.
3. Big Data and Cloud Technologies:
• Leverage big data tools like Apache Spark, Hadoop, and Kafka for processing high-volume data.
• Deploy cloud-based solutions on AWS, Azure, or Google Cloud platforms for scalable data storage and computation.
• Monitor and troubleshoot data processing jobs to ensure reliability.
4. Collaboration with Teams:
• Work closely with data scientists to provide clean, reliable data for machine learning and analytics projects.
• Partner with business analysts and operations teams to understand data requirements and implement tailored solutions.
• Support real-time data streaming for use cases like drilling monitoring and predictive maintenance.
5. Automation and Optimization:
• Automate repetitive data engineering tasks using scripting languages like Python or Bash.
• Optimize database performance through indexing, partitioning, and query optimization.
• Build monitoring and alerting systems to ensure data pipeline health.
6. Documentation and Compliance:
• Document workflows, schemas, and data models to ensure transparency and ease of maintenance.
• Implement data quality checks and version control to maintain data integrity.
• Ensure adherence to oil and gas industry data usage and storage regulations.
Qualifications:
• Bachelor’s or Master’s in Computer Science, Data Engineering, or a related field.
• 3-5 years of experience in data engineering, with exposure to oil and gas data sources preferred.
• Strong knowledge of SQL, Python, and data pipeline tools (e.g., Apache Airflow, Luigi).
• Experience with cloud platforms (AWS, Azure, Google Cloud) and big data frameworks.
Key Skills:
• Expertise in data pipeline development and ETL processes.
• Proficiency in big data and cloud-based technologies.
• Strong understanding of database design and performance optimization.
• Ability to work collaboratively with cross-functional teams.
• Excellent problem-solving and analytical skills.
Nice-to-Haves:
• Familiarity with oil and gas-specific software, data types, and data sources.