Seeking a position as a system analyst where my professional knowledge of system analysis, data engineering, and data analysis, together with my language ability and character, can contribute to the company.
[email protected] 0900263385 Kaohsiung City, Taiwan
March 2023 - Present
Achievements:
1. Improved ETL execution time by 50% (from 5 hours to 2.5 hours).
2. Led the app team to migrate the data access method from 19 database VIEWs to 9 APIs in 2 months.
3. Maintained a 95% quality rating in the Data Center Owner's reviews of ETL code and pipeline design for a whole year.
4. Built up a GitLab server and source-code version control for the company, conducted code reviews, and defined released code versions.
5. Developed an automatic notification function in Python to monitor ETL jobs (see the sketch after this section).
6. Planned the data analysis POC activity to select the appropriate BI vendor for the company.
7. Cleaned up the data flow of the ODS and data warehouse, including table-to-ETL-job relationships and data lineage.
8. Designed a checklist of over 20 inspection items to review data and program quality, cutting running costs of 3 workdays * 3 people * daily maintenance cost * 107 deployments at the 95% quality rate (saving at least 6.7 million per year).
9. Passed Microsoft Certified: Azure Database Administrator Associate (certificate number: EA5F98-6C2B4Y).
Job description:
1. Big data platform owner: lead the BI team to maintain the ETL pipelines and provide a stable, correct data service.
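
A minimal sketch of the notification function mentioned in achievement 5, assuming a job-status table polled by the monitor and an SMTP relay; the schema, host, and addresses are hypothetical placeholders rather than the production setup.

    # Sketch of an ETL job monitor with e-mail alerts (hypothetical schema/hosts).
    import smtplib
    import sqlite3
    from email.message import EmailMessage

    def fetch_failed_jobs(conn):
        # Assumed schema: etl_job_status(job_name TEXT, status TEXT, finished_at TEXT)
        return conn.execute(
            "SELECT job_name, finished_at FROM etl_job_status"
            " WHERE status = 'FAILED'"
        ).fetchall()

    def notify(failed, smtp_host="mail.example.com"):
        # One summary mail per polling cycle instead of one mail per failure.
        if not failed:
            return
        msg = EmailMessage()
        msg["Subject"] = f"[ETL] {len(failed)} job(s) failed"
        msg["From"] = "etl-monitor@example.com"
        msg["To"] = "bi-team@example.com"
        msg.set_content("\n".join(f"{job} failed at {ts}" for job, ts in failed))
        with smtplib.SMTP(smtp_host) as smtp:
            smtp.send_message(msg)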
May 2019 - February 2023
Achievements:
1. Processed over 2 billion records per hour with Hive SQL on Hadoop.
2. Optimized Hadoop SQL queries, speeding up performance by 30% on tables of more than 2 million records.
3. Designed a checklist of over 30 inspection items to review data and program quality, cutting running costs of 5 workdays * 5 people.
4. Released high-quality data inventory / SPEC documents covering over 300 tables in 2 years.
5. Introduced VS Code as an all-in-one replacement for multiple PuTTY windows, Git Bash, and UltraEdit, creating a more convenient development environment.
6. Ranked in the top 40% of a 22-person group on the 2022 KPI review.
Job description:
1. Designed and executed ETL pipeline solutions (shell, Informatica, SQL) that load data into the data warehouse accurately and quickly.
2. Verified that data dependencies were correct and consistent with the data owners.
3. Designed the data collection and refresh mechanism so that rerun jobs replace, append, or upsert data automatically (see the sketch after this section).
4. Created shell scripts to execute batch jobs automatically with IBM TWS (Tivoli Workload Scheduler).
5. Migrated data across heterogeneous databases (e.g., Teradata, MS SQL, Oracle, DB2, Hadoop).
6. Coached interns on ETL.
7. Completed basic Tableau training in data cleaning, data analysis, and data visualization.
Skills: Python, SQL, Git, Hadoop, Teradata, Shell scripts, Informatica
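
The sketch below illustrates the three refresh modes named in item 3 (replace, append, upsert), using sqlite3 only for brevity; the production pipelines ran on Informatica and warehouse SQL, so the table, schema, and upsert syntax here are illustrative assumptions.

    # Sketch of replace / append / upsert refresh modes (illustrative schema).
    import sqlite3

    def refresh(conn, target, rows, mode):
        if mode == "replace":        # full reload: wipe the table, re-insert everything
            conn.execute(f"DELETE FROM {target}")
            conn.executemany(f"INSERT INTO {target} VALUES (?, ?)", rows)
        elif mode == "append":       # incremental load: add the new batch only
            conn.executemany(f"INSERT INTO {target} VALUES (?, ?)", rows)
        elif mode == "upsert":       # insert, or update on key conflict (SQLite 3.24+)
            conn.executemany(
                f"INSERT INTO {target} VALUES (?, ?)"
                " ON CONFLICT(id) DO UPDATE SET val = excluded.val",
                rows,
            )
        conn.commit()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, val TEXT)")
    refresh(conn, "sales", [(1, "a"), (2, "b")], "append")
    refresh(conn, "sales", [(2, "b2"), (3, "c")], "upsert")
    print(conn.execute("SELECT * FROM sales").fetchall())  # [(1, 'a'), (2, 'b2'), (3, 'c')]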
February 2017 - May 2019
Achievements:
1. Imported over 200,000 Bureau of Labor Insurance records every 2 months to validate insured status.
2. Integrated 6 bank-account data formats, from 5 farmers' association information centers and the Agricultural Finance Bureau, into the system handling the insurance fee business.
3. Designed an easy, fast smartphone insurance process: scanning an ID card cut the insurance review from 10 minutes to 3 seconds, for over 150,000 insured customers.
4. Designed a front-page dashboard with entry points to frequently used features (notices, statistics reports, and TODO lists), saving customer service agents and farmers' association clerks 20% of their time.
5. Integrated the POS system and the app for business purposes.
Job description:
1. Drew interactive prototypes with Axure RP and provided high-quality SPECs to team members to improve communication efficiency.
2. Built up relevant, useful system documents to keep communication clear and efficient.
3. Provided appropriate solutions for system integration.
4. Verified and tested app functions with MS SQL and the Postman API platform.
5. Performed system analysis and design for the web (.NET MVC + C#) and web app (React + Node.js).
6. Designed and wrote API data exchange specifications.
Skills: MS SQL, Axure RP, System analysis and design
May 2016 - February 2017
Developed GUIs with JavaFX
March 2012 - May 2016
Facebook fan page: 8,071 likes, 7,977 followers
1. Increased sales by 15% year over year for two years.
2. Managed and analyzed the Facebook fan page to launch social advertising campaigns.
March 2010 - February 2012
Developed and maintained a website with C# and MS SQL (SQL Server 2000)
2005 - 2007
2001 - 2005
1. Designed and developed an ETL pipeline extracting 3 tables into 1 loading table. The data life cycle is every business day, and its refresh mechanism appends data automatically.
2. Labeled digital-customer data with SQL for open point rewards.
1. Designed and developed an ETL pipeline extracting one file and one table into one loading table. The data life cycle is every day, and its refresh mechanism appends data automatically.
2. However, the life cycle of the loading area is every business day, so a temporary table is needed to store non-business-day data (see the sketch below).
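
A minimal sketch of that staging pattern, assuming hypothetical table names and a weekends-only business-day rule standing in for the real holiday calendar: rows that arrive on a non-business day are parked in a temporary table and folded into the loading table on the next business day.

    # Sketch: park non-business-day rows, flush them on the next business day.
    import sqlite3
    from datetime import date

    def is_business_day(d: date) -> bool:
        # Assumption: weekends only; the real job would consult a holiday calendar.
        return d.weekday() < 5

    def load(conn, rows, today: date):
        if is_business_day(today):
            # Fold parked rows into the loading table, then append today's batch.
            conn.execute("INSERT INTO loading SELECT * FROM staging")
            conn.execute("DELETE FROM staging")
            conn.executemany("INSERT INTO loading VALUES (?, ?)", rows)
        else:
            conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)
        conn.commit()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE loading (id INTEGER, val TEXT)")
    conn.execute("CREATE TABLE staging (id INTEGER, val TEXT)")
    load(conn, [(1, "sat")], date(2024, 6, 1))  # Saturday: parked in staging
    load(conn, [(2, "mon")], date(2024, 6, 3))  # Monday: staging flushed, batch appended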