Hadoop Platform Engineer
Our Bangkok team is looking for top-quality, passionate engineers to build our next-generation data platform products.
- Our systems scale across multiple data centers, handling millions of writes per second and managing petabytes of data. We tackle problems ranging from real-time data ingestion and replication to enrichment, storage, and analytics. We are not just using Big Data technologies; we are pushing them to their limits.
- In the competitive world of online travel agencies, even the slightest advantage found in the data can make or break a company. That is why data systems are among our top priorities.
- While we are proud of what we have built so far, there is still a long way to go to fulfill our vision for data. We are looking for people like you, who are as excited about data technology as we are, to join the fight.
You can be part of designing, building, deploying (and probably debugging) systems across all aspects of our core data platform.
Why Agoda Hadoop Platform Team?
When you join this team, you will help run multiple Hadoop clusters across multiple data centers, serving teams across all of Agoda. These clusters handle large numbers of concurrent jobs at any one time, processing petabytes of data for purposes ranging from machine learning and BI to cubes on Hadoop and advertising.
To do this successfully, you will take an infrastructure-as-code approach, automating everything from deployment to monitoring and remediation. Monitoring is a particular focus, as it gives you the ability to quickly identify and fix issues, whether they are infrastructure, data, or job related.
Day to Day:
- You will manage, administer, troubleshoot and grow multiple Hadoop clusters
- You will build automated tools to solve operational issues
- You will run effective POCs on new platform products that can expand the list of services we offer
What You'll Need:
- You'll probably hold a bachelor's degree in Computer Science / Information Systems / Engineering / a related field
- You'll be skilled with Bash and at least one other scripting language (Python, Perl, etc.)
- You'll have a deep understanding of data architecture principles
- You'll be experienced in solving problems and working with a team to resolve large scale production issues
- You'll be experienced with configuration management systems (e.g. Puppet, Chef, SaltStack) and version control (e.g. Git, SVN)
- You'll have experience with Linux HA, clustering and load balancing solutions
- You'll have great multi-tasking skills
- You'll have proficient oral and written English communication skills
Nice to Haves:
- Experience installing, administering, and managing Hadoop clusters (Hadoop 2 with YARN)
- Experience with Cloudera Hadoop (CDH)
- Experience with Hadoop query engines (Hive, Spark SQL, Impala)
- Experience with modern programming languages, preferably JVM-based (Java, Scala)
- Solid experience in JVM tuning
- Experience working with open source products
- Experience working in an agile environment using test-driven methodologies
With Agoda you can grow rapidly as an engineer.
- You will work with top Hadoop engineers
- You will have the opportunity to apply and expand your expertise
- You will have a big impact on the business
Some tech you will use:
Hadoop, Spark, Hive, Impala, Scala, Avro, Parquet, Sensu, Elasticsearch, Python, Django, Postgres
If that’s the kind of team you want to join, let’s talk!
Agoda is an equal opportunity employer and values diversity. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status or disability status.
Please note this role is open to both local and international applicants. Full visa sponsorship and relocation assistance are available.