Job Description
Responsibilities:

Data Pipeline Development:
* Design, build, and maintain scalable, robust data pipelines to support data integration and data warehousing.
* Develop and optimize ETL processes to ingest, clean, and transform data from sources such as Qualys and the CMDB (a minimal pipeline sketch follows this list).
* Ensure the reliability, availability, and performance of data systems.
Data Management:
* Manage and maintain data architecture, data models, and data schemas.
* Implement and maintain data governance and data quality standards.
* Work with relational and NoSQL databases, ensuring data integrity and security.

Performance Monitoring and Optimization:
* Monitor database performance and optimize query execution for maximum efficiency.
* Troubleshoot and resolve database-related issues.

Data Integrity and Security:
* Ensure data integrity and security, including managing user access and permissions.
* Develop and implement backup and recovery procedures to minimize data loss in the event of hardware or software failure.

Cloud-Based Database Management:
* Manage cloud-based databases on platforms such as AWS, Azure, and Google Cloud Platform.
* Keep up to date with the latest PostgreSQL and MongoDB releases, features, and patches.

Collaboration and Documentation:
* Collaborate with developers and other IT staff to ensure database systems meet business requirements.
* Document database processes and procedures.
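For illustration only, here is a minimal sketch of the kind of ETL step this role involves: pulling records from a REST source, cleaning them, and loading them into PostgreSQL. The endpoint, DSN, table, and field names below are hypothetical placeholders, not systems named in this posting.

```python
import requests
import psycopg2

# Hypothetical endpoint and connection string -- illustrative placeholders only.
SOURCE_URL = "https://example.internal/api/assets"
PG_DSN = "dbname=warehouse user=etl host=localhost"

def extract():
    """Pull raw asset records from the (hypothetical) source API."""
    response = requests.get(SOURCE_URL, timeout=30)
    response.raise_for_status()
    return response.json()

def transform(records):
    """Drop malformed rows and normalize field values."""
    clean = []
    for rec in records:
        if not rec.get("hostname"):
            continue  # skip rows missing the key field
        clean.append((rec["hostname"].lower(), rec.get("os", "unknown")))
    return clean

def load(rows):
    """Upsert cleaned rows into a staging table.

    Assumes a unique constraint on staging.assets(hostname).
    """
    with psycopg2.connect(PG_DSN) as conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO staging.assets (hostname, os) VALUES (%s, %s) "
            "ON CONFLICT (hostname) DO UPDATE SET os = EXCLUDED.os",
            rows,
        )

if __name__ == "__main__":
    load(transform(extract()))
```

A production pipeline would add retries, incremental loading, and logging; the point here is the shape: extract, transform, and load kept as separate, independently testable functions.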
Requirements:
* 5-10 years of experience as a Database Administrator / Architect, including hands-on experience with PostgreSQL and MongoDB and strong Python skills.
* Proficiency in PostgreSQL and MongoDB.
* Strong skills in Python and Ansible for development and automation.
* Familiarity with cloud-based database management (AWS, Azure, Google Cloud Platform).
* Experience in creating and managing databases, tables, and indexes.
* Strong understanding of database performance monitoring and optimization (see the query-plan sketch after this list).
* Knowledge of data integrity and security best practices.
* Proficient in developing backup and recovery procedures.
* Ability to analyze and organize raw data and build data systems and pipelines.
* Experience in conducting complex data analysis and building ETL solutions.
* Knowledge of Agile Methodology.
* Nice to have: knowledge of vulnerability management and broader cybersecurity awareness (e.g., the ISO 27001 ISMS framework).
* Bachelor’s degree in Computer Science, Information Technology, or a related field.
* Excellent problem-solving and troubleshooting skills.
* Strong collaboration and communication skills.
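As a concrete illustration of the performance-monitoring requirement above (the query-plan sketch referenced in that list), here is a small Python helper that asks PostgreSQL for a query's execution plan. The DSN, table, and query are hypothetical.

```python
import psycopg2

PG_DSN = "dbname=warehouse user=dba host=localhost"  # hypothetical connection string

def explain(query, params=()):
    """Print the execution plan PostgreSQL chooses for a query.

    EXPLAIN ANALYZE actually executes the statement, so use it on
    read-only queries or roll the transaction back afterwards.
    """
    with psycopg2.connect(PG_DSN) as conn, conn.cursor() as cur:
        cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + query, params)
        for (line,) in cur.fetchall():
            print(line)
        conn.rollback()  # discard any side effects of running the query

if __name__ == "__main__":
    # A sequential scan in this output would suggest adding an index on hostname.
    explain("SELECT * FROM staging.assets WHERE hostname = %s", ("web-01",))
```

Reading plans like this is the day-to-day core of the optimization work listed above: spot the slow node, then fix it with an index, a rewrite, or updated statistics.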
Benefits:
* Permanent contract.
* Hybrid remote work model to enhance your flexibility.
* Flexible hours to organize your day as you prefer.
* Condensed working hours on Fridays so you can start your weekend earlier.
* Relocation opportunities and support in finding housing.
* Continuous training and professional development programs.
* Meal allowance.
* Special discounts on group employee insurance policies.
* Life and accident insurance.
* Annual medical check-ups.
Don't miss this unique opportunity! If you're interested in being part of a challenging and growing project, send us your CV and discover how you can impact the future of financial technology.