Job Responsibilities:
Design, optimize, and maintain PostgreSQL database architecture to ensure high performance and stability.
Develop and implement database schemas, indexing strategies, and query optimization to enhance efficiency.
Manage and optimize database clusters, supporting high concurrency and high availability requirements.
Write and optimize stored procedures, triggers, and views to keep database-side logic efficient.
Monitor database performance, analyze query execution plans, and resolve slow queries and deadlocks (an illustrative tuning sketch follows this list).
Perform regular backups, recoveries, and data migrations to ensure data security and integrity.
Implement sharding, read-write separation, and distributed architectures for large-scale data processing.
Collaborate closely with backend developers, DevOps, and data analysts to provide database optimization support.
Research and implement database security measures to prevent SQL injection, data breaches, and other risks.
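For illustration only (not part of the formal responsibilities), the slow-query and indexing work described above often looks like the sketch below. It uses psycopg2; the connection string, the `orders` table, its `customer_id`/`status` columns, and the index name are all hypothetical placeholders, not details from this posting.

```python
# Illustrative sketch only: inspect a slow query's plan, then add a covering index.
# The DSN, table, columns, and index name are assumptions for the example.
import psycopg2

SLOW_QUERY = "SELECT * FROM orders WHERE customer_id = %s AND status = 'pending'"

conn = psycopg2.connect("dbname=appdb user=dba host=localhost")
conn.autocommit = True  # CREATE INDEX CONCURRENTLY cannot run inside a transaction

with conn.cursor() as cur:
    # Check the actual execution plan to see where time is spent
    # (e.g., a sequential scan on a large table).
    cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + SLOW_QUERY, (42,))
    for (line,) in cur.fetchall():
        print(line)

    # A partial index matching the filter is one common fix; CONCURRENTLY
    # avoids blocking writes on a busy production table.
    cur.execute(
        "CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_orders_pending_customer "
        "ON orders (customer_id) WHERE status = 'pending'"
    )

conn.close()
```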
Job Requirements:
Bachelor’s degree in Computer Science or a related field, with 3+ years of PostgreSQL experience (large-scale data management experience preferred).
Strong proficiency in SQL and advanced PostgreSQL features (e.g., indexing, query optimization, and transaction management).
Solid grounding in database design principles, with experience handling high-concurrency, large-scale data scenarios.
Expertise in PostgreSQL high-availability solutions such as streaming replication, Patroni, and Pgpool-II.
Familiarity with database monitoring tools (e.g., pgAdmin, Prometheus + Grafana).
Knowledge of database partitioning, sharding, and read-write separation for optimizing large-scale data storage.
Proficiency in Linux environments and the ability to write shell scripts for database automation.
Understanding of NoSQL databases (e.g., Redis, MongoDB) and cross-database optimization.
Must work on-site in Ho Chi Minh City (no remote work accepted).
Bonus Skills (Not Required):
✅ Experience in game database optimization for high-concurrency environments.
✅ Knowledge of cloud database management (e.g., AWS RDS, Google Cloud SQL, Azure PostgreSQL).
✅ Hands-on experience in big data processing, including ETL, data warehousing, and streaming data processing.
✅ Familiarity with Python, Go, or Java for database automation or data analysis (a small automation sketch follows this list).
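As a rough sketch of what "database automation" can mean for this role, the script below runs a nightly logical backup with simple retention. The connection string, backup directory, and 7-day retention window are assumptions for the example, not requirements from this posting.

```python
# Illustrative sketch only: nightly pg_dump backup with simple retention.
# The connection string, backup directory, and retention period are assumptions.
import subprocess
from datetime import datetime, timedelta
from pathlib import Path

BACKUP_DIR = Path("./backups")
DSN = "postgresql://dba@localhost/appdb"
RETENTION_DAYS = 7

def run_backup() -> Path:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    target = BACKUP_DIR / f"appdb_{datetime.now():%Y%m%d_%H%M%S}.dump"
    # Custom-format dump (-Fc) is compressed and restorable with pg_restore.
    subprocess.run(["pg_dump", "--format=custom", f"--file={target}", DSN], check=True)
    return target

def prune_old_backups() -> None:
    # Delete dumps older than the retention window.
    cutoff = datetime.now() - timedelta(days=RETENTION_DAYS)
    for dump in BACKUP_DIR.glob("appdb_*.dump"):
        if datetime.fromtimestamp(dump.stat().st_mtime) < cutoff:
            dump.unlink()

if __name__ == "__main__":
    print(f"Wrote {run_backup()}")
    prune_old_backups()
```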