I am a lead architect responsible for designing and implementing large-scale data pipelines for Lomotif and Paktor x 17LIVE, using GCP/AWS/Python/Scala in collaboration with data science and machine learning teams in Singapore and the Taiwan HQ, and I have also worked with the Hadoop ecosystem (HDFS/HBase/Kafka) at JSpectrum in Hong Kong and Sydney.
With over 15 years of experience in designing and developing Java/Scala/Python-based applications for daily operations, I bring:
● At least 8 years of experience in data analysis, pipeline design and development, and tool building as a team member.
● In-depth knowledge of the Spark and Hadoop ecosystems, including HDFS, HBase, and more.
Senior Data Engineer at Paktor x 17LIVE | AWS Big Data Specialist | Data Architect
Singapore / Hong Kong / Taiwan
https://www.linkedin.com/in/chin-hung-wilson-liu-29392957
Nanxing Rd., Xizhi Dist., New Taipei City, Taiwan (R.O.C.)
Description and Responsibilities: Lomotif is a leading short-video social platform in South America and India that holds PBs of videos in storage buckets and serves millions of users. The DataOps and AI teams take part in many challenging projects, e.g. Ncanto, XROAD services, Ray Serve, and scalable model-serving frameworks that support the recommendation and moderation pipelines; we also integrated Universal Music Group (UMG) music and the full catalog feed with 7digital. The DataOps team handles 10TB+ of data in day-to-day operations, moderates model training results, and designs SLIs/SLOs for EKS clusters (see the model-serving sketch below). More responsibilities/details as below.
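As an illustration of the scalable model-serving work mentioned above, below is a minimal Ray Serve sketch; the deployment name, replica count, and scoring logic are hypothetical placeholders, not the production setup.

# Minimal Ray Serve sketch of a moderation-model endpoint (illustrative only).
# ModerationModel, the replica count, and the threshold are assumptions.
from ray import serve
from starlette.requests import Request


@serve.deployment(num_replicas=2, ray_actor_options={"num_cpus": 1})
class ModerationModel:
    def __init__(self) -> None:
        # A real deployment would load a trained model from a registry or bucket.
        self.threshold = 0.8

    async def __call__(self, request: Request) -> dict:
        payload = await request.json()
        # Placeholder scoring standing in for real model inference.
        score = min(len(payload.get("caption", "")) / 100.0, 1.0)
        return {"flagged": score > self.threshold, "score": score}


app = ModerationModel.bind()
# Deploy locally with:  serve run my_module:app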
Description and Responsibilities: The engineering team's main responsibility is launching ScoutAsia (by Nikkei and the Financial Times) content onto SGX TitanOTC's platform. Titan users are able to access Nikkei news articles across 11 categories, including equities, stocks, indices, foreign exchange, and iron ore. The DPP (Data) team processes hundreds of GB of article, market, financial, relationship, and organization data for day-to-day operations on Azure and on-premise environments. More responsibilities/details as below.
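As a rough illustration of the day-to-day article processing on Azure described above, the sketch below shows a batch upload of article records to Azure Blob Storage; the container name, connection string, and record fields are assumptions, not the actual ScoutAsia schema.

# Minimal sketch of a batch article load into Azure Blob Storage (illustrative only).
# The container name, connection string, and article fields are assumptions.
import json
from datetime import date

from azure.storage.blob import BlobServiceClient

ARTICLES = [
    {"id": "a-001", "category": "equities", "headline": "..."},
    {"id": "a-002", "category": "foreign exchange", "headline": "..."},
]


def upload_daily_articles(conn_str: str, container: str = "articles") -> None:
    service = BlobServiceClient.from_connection_string(conn_str)
    container_client = service.get_container_client(container)
    # Partition blobs by ingestion date so downstream jobs can pick up one day at a time.
    blob_name = f"dt={date.today().isoformat()}/articles.jsonl"
    body = "\n".join(json.dumps(a) for a in ARTICLES)
    container_client.upload_blob(name=blob_name, data=body, overwrite=True)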
Description and Responsibilities: The big challenges for the 17 Media data team are fast-growing data volume (processing at the 5-10 TB level daily), complex cooperation with stakeholders, cost optimization of the pipeline, refactoring high-latency systems, etc. As a senior data member, I am building a data dictionary and explaining/designing how the whole pipeline works component by component, especially how to resolve the bottlenecks (see the sketch after this entry). More responsibilities/details as below.
Tech Stacks :
Reports to : Data Head
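To make the data-dictionary work above concrete, here is a minimal sketch of how dataset and column definitions could be modeled in code; the dataset, owner, SLA, and column names are hypothetical, not 17 Media's actual schema.

# Minimal sketch of a programmatic data-dictionary entry (illustrative only).
# The dataset, owner, SLA, and column names below are hypothetical.
from dataclasses import dataclass, field


@dataclass
class ColumnDef:
    name: str
    dtype: str
    description: str


@dataclass
class DatasetDef:
    name: str
    owner: str
    sla_hours: int  # freshness target, useful for SLI/SLO checks
    columns: list[ColumnDef] = field(default_factory=list)


LIVE_STREAM_EVENTS = DatasetDef(
    name="live_stream_events",
    owner="data-eng",
    sla_hours=2,
    columns=[
        ColumnDef("user_id", "STRING", "Hashed viewer identifier"),
        ColumnDef("event_ts", "TIMESTAMP", "Client event time (UTC)"),
        ColumnDef("gift_amount", "INT64", "Virtual gift value, 0 if none"),
    ],
)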
Description and Responsibilities : This was another 0-to-1 story. As an early data member, we had to figure out the company's data-driven policies, strategies, and engineering requirements. In Paktor, the data and backend sides are 100% on AWS, so the whole data ingestion, automation, data warehouse, etc. rely on those components. We process 50-100 GB of real-time/batch jobs plus other data sources (RDBMS, APIs) for ETL/ELT on S3 and Redshift; the data platform helps our marketing team and HQ scientists turn data into insights and make good decisions (see the loading sketch after this entry). More responsibilities / details as below.
Tech Stacks :
Reports to : CTO, Data Head
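As an illustration of the S3-to-Redshift ETL step described above, below is a minimal sketch of a batch load via Redshift's COPY command; the table, bucket path, and IAM role are hypothetical.

# Minimal sketch of an S3 -> Redshift batch load (illustrative only).
# The table, bucket path, and IAM role below are hypothetical.
import psycopg2

COPY_SQL = """
    COPY analytics.user_events
    FROM 's3://example-data-lake/events/dt=2024-01-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    FORMAT AS JSON 'auto'
    TIMEFORMAT 'auto';
"""


def load_daily_events(dsn: str) -> None:
    # COPY lets Redshift pull files from S3 in parallel, which is far faster
    # than row-by-row INSERTs issued from the client side.
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(COPY_SQL)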
Description and Responsibilities : JSpectrum is a leading passive location-based service company in Hong Kong with many interesting products such as NetProbe, NetWhere, NetAd, etc. On the Optus project (the main project, in Sydney), the system analyst's main responsibility is designing and implementing real-time data ingestion, loading, and data management with major components of the Hadoop ecosystem. We faced the challenge of processing 15,000 TPS, 60,000 inserts per second, and 300 GB of daily storage, so we optimized those components with Kafka consumers and HDFS storage, redesigned HBase row keys/columns to fulfill the requirements, and deployed NetAd as a whole in-house solution at Optus (see the ingestion sketch after this entry). More responsibilities / details as below.
Tech Stacks :
Reports to : CTO
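The sketch below illustrates the Kafka-to-HBase ingestion pattern referenced above, with a salted, reverse-timestamp row key to avoid region hot-spotting; the topic, table, column family, and key scheme are assumptions rather than the Optus production design.

# Minimal sketch of a Kafka -> HBase ingestion path (illustrative only).
# The topic, table, column family, and row-key scheme are assumptions.
import json
import zlib

import happybase
from kafka import KafkaConsumer

SALT_BUCKETS = 16
MAX_TS_MS = 10 ** 13  # reversed timestamp keeps the newest events first per key


def row_key(subscriber_id: str, event_ts_ms: int) -> bytes:
    # Salting the key prefix spreads writes across region servers instead of
    # hot-spotting a single region during high-TPS ingestion.
    salt = zlib.crc32(subscriber_id.encode()) % SALT_BUCKETS
    return f"{salt:02d}|{subscriber_id}|{MAX_TS_MS - event_ts_ms}".encode()


def run(broker: str = "localhost:9092", hbase_host: str = "localhost") -> None:
    consumer = KafkaConsumer(
        "location-events",
        bootstrap_servers=broker,
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    table = happybase.Connection(hbase_host).table("location_events")
    with table.batch(batch_size=1000) as batch:  # batched puts for throughput
        for msg in consumer:
            event = msg.value
            batch.put(
                row_key(event["subscriber_id"], event["ts"]),
                {b"d:cell_id": str(event["cell_id"]).encode(),
                 b"d:ts": str(event["ts"]).encode()},
            )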
Description and Responsibilities : TORO is a technology business that provides a mobile platform and its associated systems, services, and rules to help brands (with an initial focus on sports teams, smart cities, and streaming apps) become super-apps and generate additional revenue with minimal effort. Responsibilities as below.
Tech Stacks: MySQL / Spring / Hibernate / XML / Apache Camel / Java / POJO, etc.
Reports to : Head of Server Solutions
Description and Responsibilities : Digital River partners proactively with leading enterprise brands, providing API-based Payments & Risk, Order Management, and Commerce services. The big challenge at DR is integrating with existing modules and working well within a huge code base (over 2 million lines), following a strict process of requirements analysis, design, implementation, testing, and code review. More responsibilities as below.
Tech Stacks: Oracle / Tomcat / Spring / Struts / JDO / XML / JUnit / Java / J2EE, etc.
Reports to : Technical Development Manager
Description and Responsibilities : Stark Technology (STI) is the largest domestic system integrator in Taiwan. We plan and deliver complete ICT solutions for a wide spectrum of industries by representing and reselling the world's leading products, using the most advanced technology, and providing the best professional services. More responsibilities / projects as below.
Tech Stacks : Oracle / Sybase / Tomcat / Weblogic / Spring / Struts / Hibernate / Fatwire / Java / J2EE, etc.
Reports to : Technical Manager
National Taiwan University, 2010 – 2011
EMBA Programs, Business Administration, Accounting, Finance and International Business.
Chinese Culture University, Master of Information Management, 2002 – 2005
Major concentrations in Computer Science, Data Mining, Expert Systems, and Knowledge Bases.
Chinese Culture University, Bachelor of Science in Journalism, 1998 – 2002