Cloud & Data Engineer
Department/Team: BEES, Technology & Analytics / Data & Online Journey
Position: Cloud & Data Engineer
Location: ASEM Tower, Samseong-dong
Employment Type: Full-time (3-month probationary period / 100% salary paid)
People Managing : N
We are seeking a Cloud & Data Engineer to support the development and operation of scalable data and cloud infrastructure across multiple cloud data platforms (GCP, Azure Databricks, …). This entry-level role combines elements of both data engineering and cloud engineering, making it ideal for someone eager to grow their skills in both areas.
You will collaborate closely with local and global colleagues and a variety of stakeholders—including infrastructure managers, security teams, data analysts, governance leads, and product managers—to help build secure, reliable, and efficient cloud-based data solutions. Strong communication and collaboration skills are essential.
▶ Key Accountabilities & KPI (Key Responsibilities)
[Cloud Engineering (Primary Focus)]
Lead provisioning and management of cloud infrastructure in GCP and Azure, including compute, storage, networking, and security components.
Design and maintain CI/CD pipelines for deploying cloud infrastructure and data services.
Implement cloud cost optimization and manage monitoring/alerting through tools like Azure Monitor, GCP Cloud Monitoring, and Datadog.
Ensure compliance with security best practices, including IAM, data encryption, and firewall configurations.
[Data Engineering (Basic Understanding)]
Possess foundational skills in Python, SQL, and PySpark, sufficient for basic data transformation and validation tasks.
Understand the concepts of data lakes and data warehouses, and how they integrate into cloud architectures.
Support simple batch or streaming data pipelines using tools like Azure Data Factory or Databricks.
Collaborate with data teams to assist in data ingestion, monitoring, and troubleshooting within pipelines.
▶ Qualifications (Requirements)
Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent hands-on experience).
Basic understanding of cloud platforms (GCP and/or Azure), especially storage, compute, and IAM concepts.
Foundational experience with Python, SQL, and PySpark.
Interest in or exposure to infrastructure as code (e.g., Terraform, Bicep).
Strong proficiency in English, both written and spoken.
Comfortable working in a global, collaborative, and remote-friendly environment.
Eagerness to learn and grow in both data engineering and cloud infrastructure domains.