Cake Job Search

STARLUX Airlines 星宇航空
Application period: November 28, 2025 through December 14, 2025; the period may be shortened or extended depending on recruitment needs. Work will be assigned according to your actual expertise, covering three areas: data construction, system integration, and project management:
1. Clarify data-exchange requirements between upstream and downstream systems; collect, process, and integrate data; analyze data and identify issues
2. ETL development and maintenance; routine system maintenance, monitoring, and troubleshooting
3. Oracle database construction
4. Design, development, and management of decision-support functions
5. Improve data-conversion quality and help optimize programs and processes
6. Masking of confidential data
7. Customer master database system administration
8. Other tasks assigned by supervisors
[Notes] ※ When applying for this position, please specify the system/team you wish to join in the application questions. ※ Each applicant may apply for only one position; please read the job description carefully before applying. ※ If you apply for more than one position, only the earliest application will be reviewed, and the company reserves the right to make adjustments. ※ Applicants who pass the initial resume screening will be notified for an interview; those who do not qualify or are not selected will not be contacted.
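The confidential-data masking duty above can be sketched as a simple field-level masking step. This is a minimal illustration only: the field names and the keep-first-and-last-character rule are assumptions, not the airline's actual masking scheme.

```python
def mask_record(record, sensitive_fields=("name", "passport_no", "phone")):
    """Return a copy of `record` with sensitive fields partially masked.

    Keeps the first and last character of each sensitive value and
    replaces the middle with asterisks, so records stay traceable in
    test environments without exposing the original value.
    (Field names and the masking rule are illustrative assumptions.)
    """
    masked = dict(record)
    for field in sensitive_fields:
        value = masked.get(field)
        if isinstance(value, str) and len(value) > 2:
            masked[field] = value[0] + "*" * (len(value) - 2) + value[-1]
    return masked

passenger = {"name": "WANG DAMING", "passport_no": "312345678", "flight": "JX802"}
print(mask_record(passenger))
# Non-sensitive fields such as "flight" pass through unchanged.
```

In practice a masking job like this would run as a transformation step in the ETL flow, before data is copied to any non-production environment.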
Software Programming
Database Software Applications
Database Programming
Negotiable
1 year of experience required
No management responsibility
AUO Corporation 友達光電股份有限公司
At AUO, CIM engineers design, develop, and maintain manufacturing execution systems. The work includes system development, problem solving, system optimization, and technical support. CIM engineers are essential to AUO: they keep the production process running smoothly while improving efficiency and reducing costs. We are looking for teammates who enjoy optimizing CIM systems through data analysis and machine learning! [Responsibilities] 1. Smart manufacturing system development 2. Manufacturing Execution System development and maintenance 3. UI/UX development
Linux
Python
Java
Negotiable
1 year of experience required
No management responsibility
核桃運算股份有限公司
We are looking for a Data Engineer with 3-5 years of experience to join our data team. You will design, build, and optimize efficient data pipelines, ensuring data accuracy and timeliness from source to analytics. This role requires strong Python and ETL development skills, plus a deep understanding of data-source standards (such as CKAN and SDMX) and the ability to design data architectures from an analyst's perspective, so that the resulting data precisely supports business analysis and decision-making.
[Responsibilities]
1. Data Pipeline Development and Optimization (Core Engineering): Develop efficient, scalable ETL/ELT processes with Python and SQL. Use Apache Airflow for workflow scheduling, automation, and dependency design. Research and integrate different data sources (APIs, databases, flat files), such as World Bank open data or other statistical data formats.
2. Data Quality and System Monitoring (Quality & Observability): Build data quality checks to ensure completeness, consistency, and correctness. Design and maintain monitoring that alerts automatically on pipeline failures or data anomalies, then perform Root Cause Analysis (RCA) and fixes.
3. Data Architecture and Model Design (Architecture & Modeling): Understand downstream business analysis and reporting needs and design suitable data models so data is easy to extract and analyze. Help maintain the data dictionary and documentation to improve data observability and team collaboration.
4. Cross-Team Collaboration and Technical Support (Collaboration): Communicate with business units and Data Analysts, translate business requirements into technical specifications, and support presales and Proof of Concept (POC) validation.
Data domain knowledge: experience handling JSON, XML, CSV, and other formats; (key) familiarity with open-data standards or statistical data-exchange formats (e.g., Open Data API integration or SDMX structures) preferred; a basic grasp of data analysis methods (e.g., BI tools, statistical analysis) and of turning raw data into ready-for-analysis structures. Soft skills: sound logical thinking, the ability to independently resolve complex data anomalies, and solid technical-writing ability.
5. Other tasks assigned by supervisors.
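The data-quality-check duty described above can be sketched as a small validation pass over extracted rows. A minimal sketch with assumed field names (`id`, `country`, `value`) and assumed rules (completeness, uniqueness, non-negativity); a real pipeline would fail the Airflow task and alert when the result is non-empty.

```python
def run_quality_checks(rows, required_fields=("id", "country", "value")):
    """Run basic completeness and consistency checks on extracted rows.

    Returns a list of (row_index, problem) tuples; an empty list means
    the batch passed. Field names and rules are illustrative assumptions.
    """
    problems = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Completeness: every required field must be present and non-empty.
        for field in required_fields:
            if row.get(field) in (None, ""):
                problems.append((i, f"missing {field}"))
        # Uniqueness: primary key must not repeat within the batch.
        if row.get("id") in seen_ids:
            problems.append((i, "duplicate id"))
        seen_ids.add(row.get("id"))
        # Range check: the measure must not be negative.
        if isinstance(row.get("value"), (int, float)) and row["value"] < 0:
            problems.append((i, "negative value"))
    return problems

batch = [
    {"id": 1, "country": "TW", "value": 42.0},
    {"id": 1, "country": "", "value": -5},
]
print(run_quality_checks(batch))
# → [(1, 'missing country'), (1, 'duplicate id'), (1, 'negative value')]
```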
Linux, Git, Python, Data Modeling, ETL, MySQL, Data Analysis, Basic OS Operations, Data Backup and Recovery, Database System Administration and Maintenance, Business Intelligence, Project Communication / Integration Management
40K ~ 55K TWD / month
3 years of experience required
No management responsibility
Pinkoi
[About the Pinkoi Data Squad] We are Pinkoi's data team. Our goal is to build an effective data platform while ensuring data quality and stability, so that data delivers maximum value across Pinkoi. As a Data Engineer you will work closely with Business Analysts and ML Engineers: exploring problems through data, planning and implementing data pipelines, and delivering end-to-end data solutions that expand every department's analytics capability and strengthen our overall business intelligence systems.
What you will do:
- Work with analysts and business teams in product, marketing, and advertising to understand requirements and plan suitable data pipelines.
- Design, implement, and maintain ETL/ELT processes, ensuring data quality, stability, and freshness.
- Build and optimize data models supporting reporting, analytics, and machine learning.
- Continuously iterate on and optimize our data infrastructure to improve scalability and operational efficiency.
- Build monitoring and alerting to guarantee pipeline success rates and data SLAs.
What we hope you bring:
- A strong passion for big data and data-driven engineering.
- Understanding of ETL or ELT processes, with data modeling and data pipeline development skills.
- Experience collaborating with product/marketing teams, and the ability to work with business analysts, data engineers, and machine learning engineers.
- Eagerness to learn new technologies and keep sharpening your professional skills.
- Openness and curiosity toward large language models (LLMs), and willingness to learn and apply them to data engineering work (e.g., data cleaning, field completion, document generation, automated testing) to explore how new tools can improve efficiency and user experience.
Requirements:
- Degree in Mathematics, Statistics, Computer Science, or Data Science, or equivalent professional ability.
- One or more years of data analysis or data engineering experience.
- Proficiency in at least one of Python, Scala, or Java; Pinkoi mainly uses Python.
- Familiarity with GNU/Linux systems; Pinkoi uses Ubuntu.
- Familiarity with relational databases such as MySQL.
Nice to have:
- Familiarity with ad-attribution logic, or related data-processing experience.
- Experience processing user-behavior data.
- Experience operating or deploying BI tools such as Superset, Tableau, or Looker Studio; Pinkoi uses Superset and Redash.
- Knowledge of distributed computing, e.g., one or more of Spark, Hadoop, Hive; Pinkoi mainly uses PySpark.
- Familiarity with cloud services such as AWS Athena, S3, Glue, or GCP BigQuery; Pinkoi mainly uses AWS.
- Understanding of dbt fundamentals and usage, such as building data models, writing tests, and creating macros.
- Experience applying LLMs to data engineering scenarios, e.g., rule extraction, field alignment and completion, documentation/Data Catalog generation, data quality checks.
# Don't overthink it — come be a Pinkoi person! If you are confident you can excel in this role, send us your resume and the take-home assignment and apply now!
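The "build and optimize data models" duty above usually means aggregating raw events into fact tables for reporting. A minimal sketch in plain Python with hypothetical names and schema (Pinkoi's actual stack, per the listing, uses PySpark and dbt for this):

```python
from collections import defaultdict
from datetime import date

def build_daily_order_fact(raw_orders):
    """Aggregate raw order events into a daily fact table keyed by
    (order_date, shop_id): order count and revenue per shop per day.
    Field names and schema are illustrative assumptions."""
    fact = defaultdict(lambda: {"orders": 0, "revenue": 0.0})
    for o in raw_orders:
        key = (o["created_at"].isoformat(), o["shop_id"])
        fact[key]["orders"] += 1
        fact[key]["revenue"] += o["amount"]
    # Emit one row per (date, shop), sorted for deterministic output.
    return [
        {"order_date": d, "shop_id": s, **m}
        for (d, s), m in sorted(fact.items())
    ]

raw = [
    {"created_at": date(2025, 1, 1), "shop_id": "s1", "amount": 300.0},
    {"created_at": date(2025, 1, 1), "shop_id": "s1", "amount": 200.0},
]
print(build_daily_order_fact(raw))
```

The same grouping logic maps directly onto a `GROUP BY order_date, shop_id` in SQL or a dbt model.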
800K ~ 1.5M TWD / year
No requirement for relevant working experience
展旺數位有限公司
・Collaborate with product, risk-control, and operations teams to define data models and monitoring metrics for business needs
・Build ETL processes to collect and clean behavioral data from the platform (e.g., betting records, rebate conversions, click behavior)
・Develop and maintain anomaly-detection models (e.g., wash-trading/offsetting bets, bot behavior, arbitrage users)
・Use machine learning or statistical models to predict player retention, LTV, and churn risk
・Design risk-control strategies to strengthen the platform's control over funds and behavioral risk
・Produce regular analysis reports with actionable product or operations recommendations
Spark
Redshift
Python
50K ~ 120K TWD / month
3 years of experience required
Managing staff numbers: not specified
Foreign Professional Talent Recruitment in Taiwan
【Company Overview】 A global leader in digital asset exchange and payment technology, committed to building secure and efficient financial infrastructure. The company is now hiring a Senior Data Engineer to drive development and optimization of data systems for rewards and financial operations. This role involves close collaboration with international teams to build scalable and reliable data platforms. 【Role Highlights】 ✅ Competitive salary (USD $60,000-80,000) with strong career growth ✅ Join a global team and work with top engineers from around the world ✅ Build high-throughput, real-time data infrastructure at scale 【Responsibilities】 🔹 Design and maintain stable, high-performance data platforms and pipelines 🔹 Develop and optimize offline and real-time data warehouses 🔹 Process and transform large data sets using Hadoop, Spark, or Flink 🔹 Collaborate cross-functionally with Finance, Rewards, and Engineering teams to deliver data-driven solutions 【Requirements】 ✔ 3-8 years of experience in data platform or pipeline engineering ✔ Proficient in big data tools: Hadoop, Spark, Flink ✔ Strong programming skills in at least one of: Python, Scala, or Java ✔ Solid understanding of data warehousing and ETL (offline and real-time) ✔ Experience with Airflow is a plus ✔ Fluent in English with strong communication skills for working across global teams
If you are interested in this position, please get in touch right away for more details!
▶︎ Contact: Recruitment Consultant Owen Wu ▶︎ LINE: https://line.me/ti/p/ICjS8Jx0Dq ▶︎ LinkedIn: https://www.linkedin.com/in/owen-wu-1023a41a4/
Spark
Hadoop
Flink
60K ~ 80K USD / year
3 years of experience required
No management responsibility
2026 台達年終轉職面談會|Leap onto the Trend! 跟台達躍進下一個世代 (Delta Electronics 2026 Year-End Career Fair)
We are looking for a Senior AI Architect/Engineer with a deep AI technical background and architecture-design capability to lead the company's AI technology direction, introduce AI solutions across business processes and products, and improve overall operational efficiency and product competitiveness!
[Responsibilities]
• Act as the AI technology lead; define the AI roadmap and development strategy
• Lead AI architecture design, covering data processing, model training, deployment, and monitoring
• Work with product, business, and other cross-functional teams to identify AI use cases and propose innovative solutions
• Evaluate and adopt the latest AI technologies (e.g., generative AI, LLMs, Vision AI, AutoML)
• Mentor team members in AI model development and optimization to raise the team's technical capabilities
• Participate in technical evaluation, PoC, implementation, and benefit analysis of AI projects
[Requirements]
• Master's degree or above in Computer Science, Electrical Engineering, Statistics, Mathematics, or a related field
• 5+ years of AI/machine learning/deep learning experience
• Familiarity with mainstream AI frameworks (e.g., PyTorch, TensorFlow, Transformers, LangChain)
• Experience designing and deploying large-scale AI systems (cloud or on-premises)
• Familiarity with data engineering processes and tools (e.g., Spark, Airflow, data lakes, ETL)
• Hands-on MLOps experience a plus (e.g., MLflow, Kubeflow, Docker, Kubernetes)
• Strong communication skills and cross-functional collaboration experience
Negotiable
5 years of experience required
No management responsibility
2026 台達年終轉職面談會|Leap onto the Trend! 跟台達躍進下一個世代 (Delta Electronics 2026 Year-End Career Fair)
《Responsibilities》
o Database architecture design and optimization
1. Plan and design enterprise-grade database architectures for high availability, scalability, and performance.
2. Lead the deployment, migration, and integration of database systems (Oracle, MSSQL, PostgreSQL).
3. Define database standards and guidelines, including naming conventions, backup policies, and security requirements.
o Performance tuning and operations
1. Monitor and analyze database performance; perform SQL optimization and index tuning.
2. Establish backup and disaster-recovery strategies to ensure data integrity and business continuity.
3. Resolve complex database issues; continuously optimize resource usage and performance bottlenecks.
o Project support and cross-team collaboration
1. Lead or participate in database migration projects (e.g., database upgrades, migrations, off-site redundancy).
2. Support application development teams with database design and best-practice advice.
3. Work with infrastructure, security, and related teams to meet database SLAs and compliance requirements.
《Qualifications》
1. Deep, hands-on experience with Oracle, Microsoft SQL Server, and PostgreSQL.
2. Familiarity with high-availability solutions such as database clustering, Always On, RAC, and streaming replication.
3. SQL performance-tuning experience, able to resolve bottlenecks in large transactional and analytical systems.
4. Familiarity with ETL, data synchronization, and cross-platform data integration methods.
5. Understanding of data center infrastructure and storage architecture, able to optimize database and hardware resource allocation.
6. Familiarity with database security management, including account privileges, encryption, firewalls, and auditing.
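The SQL-optimization and index-tuning duty can be illustrated with a tiny, self-contained SQLite session that compares query plans before and after adding an index. This is illustrative only: the listing's actual environments are Oracle, MSSQL, and PostgreSQL, and the exact plan wording varies by engine and version.

```python
import sqlite3

# In-memory database with a small orders table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Before indexing: the plan detail column typically reports a full scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_before)

# After indexing: the plan should reference the new index instead.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_after)
```

The same workflow (inspect plan, add or adjust an index, re-inspect) is what `EXPLAIN PLAN` in Oracle or `EXPLAIN ANALYZE` in PostgreSQL supports at production scale.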
Negotiable
5 years of experience required
No management responsibility
Logitech
Logitech is the Sweet Spot for people who want their actions to have a positive global impact while having the flexibility to do it in their own way.
The Role:
We are looking for a candidate to join our team who will be involved in the ongoing development of our Enterprise Data Warehouse (EDW). The Media and Marketing Data Engineer role will include participating in the loading and extraction of data, including external sources through APIs, storage buckets (S3, Blob Storage), and marketing-specific data integrations. The ideal candidate will be involved in all stages of the project lifecycle, from initial planning through to deployment in production. A key focus of the role will be data analysis, data modeling, and ensuring these aspects are successfully implemented in the production environment.
Your Contribution:
Be Yourself. Be Open. Stay Hungry and Humble. Collaborate. Challenge. Decide and just Do. These are the behaviors you'll need for success at Logitech. In this role you will:
- Design, develop, document, and test ETL solutions using industry-standard tools.
- Design physical and reporting data models for seamless cross-functional and cross-system data reporting.
- Enhance point-of-sale datasets with additional data points to provide stakeholders with useful insights.
- Ensure data integrity by rigorously validating and reconciling data obtained from third-party providers.
- Collaborate with data providers and internal teams to address customer data discrepancies and enhance data quality.
- Work closely across our DI teams to deliver datasets optimized for consumption in reporting and visualization tools like Tableau.
- Collaborate with data architects, analysts, and business stakeholders to gather requirements and translate them into data solutions.
- Participate in design discussions with enterprise architects and recommend design improvements.
- Develop and maintain conceptual, logical, and physical data models with their corresponding metadata.
- Work closely with cross-functional teams to integrate data solutions.
- Create and maintain clear documentation for data processes, data models, and pipelines.
- Integrate Snowflake with various data sources and third-party tools.
- Manage code versioning and deployment of Snowflake objects using CI/CD practices.
Key Qualifications:
For consideration, you must bring the following minimum skills and behaviors to our team:
- A total of 6 to 10 years of experience in ETL design, development, and populating data warehouses, including experience with heterogeneous OLTP sources such as Oracle R12 ERP systems and other cloud technologies.
- At least 3 years of hands-on experience with Pentaho Data Integration or similar ETL tools.
- Practical experience working with cloud-based data warehouses such as Snowflake and Redshift.
- Significant hands-on experience with Snowflake utilities, including SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the optimizer, metadata management, data sharing, Snowflake AI/ML, and stored procedures.
- Experience with API-based integrations and marketing data.
- Ability to design and develop complex data pipelines and ETL workflows in Snowflake using advanced SQL, UDFs, UDTFs, and stored procedures (JavaScript/SQL).
- Comprehensive expertise in databases, data acquisition, ETL strategies, and the tools and technologies within Pentaho DI and Snowflake.
- Experience working with complex SQL functions and transformations on large data sets.
- Demonstrated experience in designing complex ETL processes for extracting data from various sources, including XML files, JSON, RDBMS, and flat files.
- Exposure to standard support-ticket management tools.
- A strong understanding of Business Intelligence and data warehousing concepts and methodologies.
- Extensive experience in data analysis and root cause analysis, along with proven problem-solving and analytical-thinking capabilities.
- A solid understanding of software engineering principles and proficiency in working with Unix/Linux/Windows operating systems, version control, and office software.
- A deep understanding of data warehousing principles and cloud architecture, including SQL optimization techniques for building efficient and scalable data systems.
- Familiarity with Snowflake's unique features, such as its multi-cluster architecture and data-sharing capabilities.
- Excellent skills in writing and optimizing SQL queries to ensure high performance and data accuracy across all systems.
- The ability to troubleshoot and resolve data quality issues promptly, maintaining data integrity and reliability. Strong communication skills are essential for effective collaboration with both technical and non-technical teams to ensure a clear understanding of data engineering requirements.
In addition, preferable skills and behaviors include:
- Exposure to an Oracle ERP environment.
- Basic understanding of reporting tools like OBIEE and Tableau.
- Exposure to marketing data platforms like Adverity, Fivetran, etc.
- Exposure to customer data platforms.
Education:
BS/BTech/MCA/MS in Computer Science, Information Systems, or a related technical field, or equivalent industry expertise. Fluency in English.
Logitech is the sweet spot for people who are passionate about products, making a mark, and having fun doing it. As a company, we're small and flexible enough for every person to take initiative and make things happen. But we're big enough in our portfolio, and reach, for those actions to have a global impact. That's a pretty sweet spot to be in and we're always striving to keep it that way.
Across Logitech we empower collaboration and foster play. We help teams collaborate and learn from anywhere, without compromising on productivity or continuity, so it should be no surprise that most of our jobs are open to work from home from most locations. Our hybrid work model allows some employees to work remotely while others work on-premises. Within this structure, you may have teams or departments split between working remotely and working in-house.
Logitech is an amazing place to work because it is full of authentic people who are inclusive by nature as well as by design. Being a global company, we value our diversity and celebrate all our differences. Don't meet every single requirement? Not a problem. If you feel you are the right candidate for the opportunity, we strongly recommend that you apply. We want to meet you!
We offer comprehensive and competitive benefits packages and working environments that are designed to be flexible and help you to care for yourself and your loved ones, now and in the future. We believe that good health means more than getting medical care when you need it. Logitech supports a culture that encourages individuals to achieve good physical, financial, emotional, intellectual, and social wellbeing so we all can create, achieve, and enjoy more and support our families. We can't wait to tell you more about them, as there are too many to list here and they vary based on location.
All qualified applicants will receive consideration for employment without regard to race, sex, age, color, religion, sexual orientation, gender identity, national origin, protected veteran status, or on the basis of disability.
If you require an accommodation to complete any part of the application process, are limited in the ability, or are unable to access or use this online application process and need an alternative method for applying, you may contact us toll free at 1-510-713-4866 for assistance and we will get back to you as soon as possible.
Negotiable
No requirement for relevant working experience
Logitech
Logitech is the Sweet Spot for people who want their actions to have a positive global impact while having the flexibility to do it in their own way.
The Role:
We are looking for a candidate to join our team who will be involved in the ongoing development of our Enterprise Data Warehouse (EDW) and supporting our POS (Point of Sale) Channel Data Management team. This role will include participating in the loading and extraction of data, including POS data to and from the warehouse. The ideal candidate will be involved in all stages of the project lifecycle, from initial planning through to deployment in production. A key focus of the role will be data analysis, data modeling, and ensuring these aspects are successfully implemented in the production environment.
Your Contribution:
Be Yourself. Be Open. Stay Hungry and Humble. Collaborate. Challenge. Decide and just Do. These are the behaviors you'll need for success at Logitech. In this role you will:
- Design, develop, document, and test ETL solutions using industry-standard tools.
- Design physical and reporting data models for seamless cross-functional and cross-system data reporting.
- Enhance point-of-sale datasets with additional data points to provide stakeholders with useful insights.
- Ensure data integrity by rigorously validating and reconciling data obtained from third-party providers.
- Collaborate with data providers and internal teams to address customer data discrepancies and enhance data quality.
- Work closely across our DI teams to deliver datasets optimized for consumption in reporting and visualization tools like Tableau.
- Collaborate with data architects, analysts, and business stakeholders to gather requirements and translate them into data solutions.
- Participate in design discussions with enterprise architects and recommend design improvements.
- Develop and maintain conceptual, logical, and physical data models with their corresponding metadata.
- Work closely with cross-functional teams to integrate data solutions.
- Create and maintain clear documentation for data processes, data models, and pipelines.
- Integrate Snowflake with various data sources and third-party tools.
- Manage code versioning and deployment of Snowflake objects using CI/CD practices.
Key Qualifications:
For consideration, you must bring the following minimum skills and behaviors to our team:
- A total of 6 to 10 years of experience in ETL design, development, and populating data warehouses, including experience with heterogeneous OLTP sources such as Oracle R12 ERP systems and other cloud technologies.
- At least 3 years of hands-on experience with Pentaho Data Integration or similar ETL tools.
- Practical experience working with cloud-based data warehouses such as Snowflake and Redshift.
- Significant hands-on experience with Snowflake utilities, including SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the optimizer, metadata management, data sharing, Snowflake AI/ML, and stored procedures.
- Ability to design and develop complex data pipelines and ETL workflows in Snowflake using advanced SQL, UDFs, UDTFs, and stored procedures (JavaScript/SQL).
- Comprehensive expertise in databases, data acquisition, ETL strategies, and the tools and technologies within Pentaho DI and Snowflake.
- Experience working with complex SQL functions and transformations on large data sets.
- Demonstrated experience in designing complex ETL processes for extracting data from various sources, including XML files, JSON, RDBMS, and flat files.
- Exposure to standard support-ticket management tools.
- A strong understanding of Business Intelligence and data warehousing concepts and methodologies.
- Extensive experience in data analysis and root cause analysis, along with proven problem-solving and analytical-thinking capabilities.
- A solid understanding of software engineering principles and proficiency in working with Unix/Linux/Windows operating systems, version control, and office software.
- A deep understanding of data warehousing principles and cloud architecture, including SQL optimization techniques for building efficient and scalable data systems.
- Familiarity with Snowflake's unique features, such as its multi-cluster architecture and data-sharing capabilities.
- Excellent skills in writing and optimizing SQL queries to ensure high performance and data accuracy across all systems.
- The ability to troubleshoot and resolve data quality issues promptly, maintaining data integrity and reliability. Strong communication skills are essential for effective collaboration with both technical and non-technical teams to ensure a clear understanding of data engineering requirements.
In addition, preferable skills and behaviors include:
- Exposure to an Oracle ERP environment.
- Basic understanding of reporting tools like OBIEE and Tableau.
Education:
BS/BTech/MCA/MS in Computer Science, Information Systems, or a related technical field, or equivalent industry expertise. Fluency in English.
Logitech is the sweet spot for people who are passionate about products, making a mark, and having fun doing it. As a company, we're small and flexible enough for every person to take initiative and make things happen. But we're big enough in our portfolio, and reach, for those actions to have a global impact. That's a pretty sweet spot to be in and we're always striving to keep it that way.
Across Logitech we empower collaboration and foster play. We help teams collaborate and learn from anywhere, without compromising on productivity or continuity, so it should be no surprise that most of our jobs are open to work from home from most locations. Our hybrid work model allows some employees to work remotely while others work on-premises. Within this structure, you may have teams or departments split between working remotely and working in-house.
Logitech is an amazing place to work because it is full of authentic people who are inclusive by nature as well as by design. Being a global company, we value our diversity and celebrate all our differences. Don't meet every single requirement? Not a problem. If you feel you are the right candidate for the opportunity, we strongly recommend that you apply. We want to meet you!
We offer comprehensive and competitive benefits packages and working environments that are designed to be flexible and help you to care for yourself and your loved ones, now and in the future. We believe that good health means more than getting medical care when you need it. Logitech supports a culture that encourages individuals to achieve good physical, financial, emotional, intellectual, and social wellbeing so we all can create, achieve, and enjoy more and support our families. We can't wait to tell you more about them, as there are too many to list here and they vary based on location.
All qualified applicants will receive consideration for employment without regard to race, sex, age, color, religion, sexual orientation, gender identity, national origin, protected veteran status, or on the basis of disability.
If you require an accommodation to complete any part of the application process, are limited in the ability, or are unable to access or use this online application process and need an alternative method for applying, you may contact us toll free at 1-510-713-4866 for assistance and we will get back to you as soon as possible.
Negotiable
No requirement for relevant working experience
