Senior Data Engineer (MySQL, PostgreSQL)
We are seeking an experienced Senior Data Engineer (MySQL, PostgreSQL) to play an essential role in implementing and maintaining a data warehouse. A career at Exadel means you will work alongside exceptional colleagues and be empowered to reach your professional and personal goals.
Work at Exadel - Who We Are
Since 1998, Exadel has been engineering its products and custom software for clients of all sizes. Headquartered in Walnut Creek, California, Exadel has 2,000+ employees in development centers across America, Europe, and Asia. Our people drive Exadel’s success and are at the core of our values.
About the Customer
The world's largest publisher of investment research. For over two decades, it has connected the world's leading asset and wealth managers with nearly 1,000 research firms in more than 50 countries, and it serves internal teams across multinational corporations from its offices in Durham (HQ), New York, London, Edinburgh, and Timisoara.
The client facilitates the equitable exchange of critical investment insights by improving efficiency, collaboration, and security across the complete information lifecycle. The ecosystem is designed to meet users' bespoke needs, from compliance tracking to interactive publishing, by removing friction from the publication, dissemination, consumption, and application of investment research content.
Requirements
- 5+ years of experience working with MySQL and PostgreSQL in production environments
- Strong expertise in query optimization, indexing strategies, execution plan analysis, and performance tuning
- Competency in database replication, failover strategies, and high-availability architectures
- Hands-on experience with AWS database services, including RDS, Aurora, and DynamoDB
- Proficiency in troubleshooting database performance issues and implementing tuning strategies
- Familiarity with database integration challenges and experience working with Indigo Data Service or similar solutions
- Experience with monitoring and alerting tools such as Prometheus, Grafana, or CloudWatch for real-time database performance tracking
- Proficiency in SQL, Python, or Bash for automating database operations
- Familiarity with schema design, normalization, and data modeling best practices
Nice to Have
- Experience with Snowflake and other cloud-based data warehouses
- Knowledge of NoSQL and analytical databases such as MongoDB, ClickHouse, or BigQuery
- Exposure to containerized database deployments using Kubernetes and Docker
- Experience with CI/CD pipelines for database schema changes and migrations
English level
Upper-Intermediate
Responsibilities
- Optimize query performance and fine-tune database configurations for MySQL to enhance system efficiency
- Resolve database integration issues by decoupling dependencies and leveraging the Indigo Data Service for improved failover strategies
- Analyze and mitigate single points of failure within the current database architecture, ensuring high availability and fault tolerance
- Improve indexing strategies, caching mechanisms, and partitioning techniques to reduce query execution times
- Monitor and troubleshoot performance bottlenecks, identifying and addressing slow queries, deadlocks, and resource contention
- Implement database scaling strategies, including read replicas, sharding, and horizontal scaling in AWS environments
- Enhance database security and compliance by implementing best practices for access control, encryption, and backup strategies
- Work closely with software engineers, DevOps, and data teams to ensure seamless database integration and high-performance data access
- Deploy and manage database solutions on AWS, optimizing for cost efficiency and scalability
- Implement automation for database maintenance tasks, including schema migrations, backups, and failover management
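To illustrate the kind of automation this role calls for (slow-query monitoring and maintenance scripting in Python, one of the languages listed above), here is a minimal, hypothetical sketch that parses a MySQL slow-query-log excerpt and flags statements whose execution time exceeds a threshold. The sample log and the threshold value are invented for illustration; they are not part of the client's actual stack.

```python
# Hypothetical sketch: flag slow statements from a MySQL slow-query-log excerpt.
# The log format follows MySQL's standard slow-log header lines
# ("# Query_time: ...  Lock_time: ..."); the sample data below is invented.
import re

SLOW_LOG_SAMPLE = """\
# Time: 2024-05-01T10:00:00.000000Z
# Query_time: 12.345678  Lock_time: 0.000123 Rows_sent: 10  Rows_examined: 500000
SELECT * FROM orders WHERE customer_id = 42;
# Time: 2024-05-01T10:00:05.000000Z
# Query_time: 0.004321  Lock_time: 0.000045 Rows_sent: 1  Rows_examined: 1
SELECT id FROM users WHERE email = 'a@example.com';
"""

# Matches the header line that records a statement's execution time.
QUERY_TIME_RE = re.compile(r"^# Query_time: (?P<qt>[\d.]+)")

def slow_queries(log_text: str, threshold_seconds: float = 1.0) -> list[tuple[float, str]]:
    """Return (query_time, statement) pairs for entries over the threshold."""
    results = []
    current_time = None
    for line in log_text.splitlines():
        m = QUERY_TIME_RE.match(line)
        if m:
            # Remember the execution time until we see the statement itself.
            current_time = float(m.group("qt"))
        elif current_time is not None and not line.startswith("#"):
            if current_time > threshold_seconds:
                results.append((current_time, line.strip()))
            current_time = None
    return results

for qt, stmt in slow_queries(SLOW_LOG_SAMPLE):
    print(f"{qt:.3f}s  {stmt}")
```

In practice a script like this would feed alerts into Prometheus or CloudWatch rather than print to stdout, but the parsing step is the same.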
Apply for this job