Your Responsibilities:
You will never do routine development on our Analytics platform; that is why the new generation loves working here. Your responsibilities are summarized below:
- Complete Hadoop and ecosystem training (in-house, online, or partner training) and obtain certification
- Build the Big Data Analytics cluster
- Assess and integrate existing customer data source systems
- Review and profile customer data (data profiling)
- Design, implement, and deploy ETL/ELT pipelines to load data into Hadoop
- Implement end-to-end Big Data solutions, including data acquisition, storage, transformation, and analysis
- Transfer knowledge to the customer and the operations support team
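As a rough illustration of the ETL/ELT work described above, the sketch below runs a minimal extract-transform-load cycle in plain Python. The table names and data are hypothetical, and in-memory SQLite stands in for both the customer source system and the Hadoop/Hive target, which this posting's real pipelines would use instead.

```python
import sqlite3

# Extract: read rows from a source system (an in-memory SQLite database
# stands in for a customer RDBMS; the "orders" table is hypothetical).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL, region TEXT)")
src.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 120.0, "north"), (2, 80.0, "south"), (3, 200.0, "north")],
)

# Transform: aggregate order amounts by region.
rows = src.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()

# Load: write the aggregated rows into a target store (in a real pipeline
# this would be HDFS or a Hive table rather than another SQLite database).
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE region_totals (region TEXT, total REAL)")
tgt.executemany("INSERT INTO region_totals VALUES (?, ?)", rows)

print(tgt.execute(
    "SELECT region, total FROM region_totals ORDER BY region"
).fetchall())
```

The same extract/transform/load structure carries over to PySpark or Sqoop jobs; only the connectors and the execution engine change.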
Your Qualifications:
- Bachelor’s degree or higher in Computer Engineering, Computer Science, Mathematics, Statistics, or a related IT field
- Passionate and enthusiastic
- Positive thinking
- Good communication and presentation skills
- Understanding of data warehouse and ETL concepts
- Minimum 1 year of experience developing in Python, PySpark, or shell scripting
- Minimum 1 year of experience building and coding applications using Hadoop components (HDFS, Hive, Sqoop, Flume, etc.)
- Minimum 2 years of experience implementing relational data models
- Minimum 2 years of experience with traditional ETL tools and RDBMSs
The following special qualifications will have a positive effect on salary:
- Full life-cycle development experience
- Minimum 1 year of experience developing REST web services
- Industry experience in one of the following: banking/financial, telecom, healthcare, or government
- Experience leading teams
- Knowledge of data security methods
Interested applicants, please click ‘Register’ or submit your updated resume and your current and expected salary, together with copies of your transcripts, identification card, house registration certificate, a recent photograph, and certificates (if any).
MSyne Innovations Company Limited (MFEC Group)
Chatujak, Bangkok THAILAND
Tel: 02-821-7894 (Contact K. Vipada)