Fulltime Data Engineers openings in California on September 01, 2022

Principal Data Engineer at Cornerstone On Demand

Location: San Francisco

Principal Data Engineer

San Francisco Bay Area

We are seeking a talented Principal Data Engineer in the San Francisco Bay Area, CA.

The Principal Data Engineer is part of the Business Intelligence team, which is responsible for analytics and integrations supported by IT across Sales, Marketing, Services, Customer Success, Finance, HR, Product, and Engineering. The right candidate has empathy, curiosity, and a desire to constantly learn and improve; technical experience with all the capabilities of a cloud-based data warehouse (AWS Redshift or Snowflake) and a cloud-based ETL/ELT tool (Matillion or equivalent); and a proven track record of driving best practices and processes, with hands-on experience in ETL development, configuration, automation, and support. This individual will: a) be a key member of the BI team within a dynamic SaaS company full of high-caliber engineers; b) design and build performant ETL projects, working independently and in partnership with a global team of PMs, business analysts, and developers (onsite and offshore); c) provide vision for enhancements and technical design, and participate in design reviews to ensure governance principles and best practices are followed.

In this role you will

Lead and execute the complete ETL development lifecycle, including requirements gathering, solution design, documentation, configuration, development, testing, and deployment; take ownership of escalated or complex issues and follow through to resolution.

Create clear, structured and effective documentation that will enable developers within the team to maintain ETL.

Oversee and ensure execution of multi-year strategies and projects, and ensure adherence to technology standards and best practices.

Define and document current and future state BI architectures, frameworks and roadmaps.

Proactively seek opportunities for process improvement: identify and resolve process bottlenecks, data issues, and inconsistencies to help improve operational efficiencies.

Suggest improvements inspired by ideas from the global Matillion community and the broader data engineering community.

Apply Agile principles to projects. Conduct regular knowledge-sharing sessions with developers to drive adoption of Matillion (or equivalent ETL) based integrations.

Present best-practice and architecture recommendations to leadership.

Perform change management functions associated with large scale high-impact configurations

Create, manage and enforce application configuration and SDLC standards.

Hands-on role (100%) – working on critical ETL fixes, design, development and QA.

Focus on Back End systems (Matillion or equivalent ETL, AWS infrastructure, Redshift/Snowflake).

Provide BI administration and technical support during weekends, after-hours and holidays, when needed.

Work on new BI projects, enhancements as well as production support.

You’ve got what it takes if you have

10+ years of ETL development experience in data warehouse, data engineering, or similar roles working with Back End systems and large, disparate, and complex datasets. Must have 7+ years of experience with Matillion or tools like Informatica/Talend/Pentaho.

Good understanding of and experience with the AWS cloud infrastructure platform, Redshift, and Snowflake (highly preferable).

Ability to develop and maintain scalable data pipelines and build new API integrations to support continuing increases in data volume and complexity.

Very good SQL and Python knowledge.

Clear understanding of data warehousing concepts such as Change Data Capture, Slowly Changing Dimensions and experience in practical implementation in ETL.

Experience using version control, Agile Project Management and Ticketing applications – GIT, JIRA, Zendesk.

Understanding and experience designing resilient ETL. Ability to anticipate issues and build exception handling processes.

Experience in data engineering, dimensional database design, data lake, and data warehouse.

Experience in practicing agile methodologies, including scrum and continuous integration environments.

Experience leading technical requirements gathering and building solutions from those requirements; working at-scale by standardizing components to ship visualizations rapidly.

Knowledge of Enterprise Business Applications – Salesforce products (Sales Cloud, Community Cloud, CPQ), Zuora, Oracle Cloud ERP, Coupa, NetSuite, Kyriba.

BTech/MCA degree in Computer Science, a similar technical field of study, or equivalent practical experience.

Demonstrated commitment to valuing diversity and contributing to an inclusive working and learning environment.

Consideration for privacy and security obligations.
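The warehousing concepts listed above, such as Slowly Changing Dimensions, are often the crux of these interviews. Below is a minimal, hypothetical SCD Type 2 merge sketched in plain Python; in practice this would be a Matillion job or SQL MERGE, and all table and field names here are illustrative, not from the posting:

```python
from datetime import date

def scd2_merge(dimension, incoming, today=None):
    """Apply a Slowly Changing Dimension Type 2 merge.

    dimension: list of dicts with keys
        key, attrs (dict), valid_from, valid_to, is_current
    incoming: dict mapping key -> latest attrs from the source
    Returns the updated dimension table with full history preserved.
    """
    today = today or date.today().isoformat()
    current = {r["key"]: r for r in dimension if r["is_current"]}
    out = list(dimension)
    for key, attrs in incoming.items():
        row = current.get(key)
        if row is None:
            # New key: insert an open-ended current row.
            out.append({"key": key, "attrs": attrs,
                        "valid_from": today, "valid_to": None,
                        "is_current": True})
        elif row["attrs"] != attrs:
            # Changed attributes: close the old row, open a new one.
            row["valid_to"] = today
            row["is_current"] = False
            out.append({"key": key, "attrs": attrs,
                        "valid_from": today, "valid_to": None,
                        "is_current": True})
        # Unchanged rows are left alone.
    return out

dim = [{"key": 1, "attrs": {"region": "West"},
        "valid_from": "2022-01-01", "valid_to": None, "is_current": True}]
dim = scd2_merge(dim, {1: {"region": "East"}, 2: {"region": "North"}},
                 today="2022-09-01")
```

The key idea a reviewer looks for: updates never overwrite the old row (that would be SCD Type 1); they close it out and insert a new current version.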

Extra dose of awesome if you have

Matillion Building A Data Warehouse or equivalent ETL certification.

Experience with Tableau development, configuration and system administration or related reporting tools.

Experience with CPQ, & quote to cash processes in a SaaS/recurring revenue company.

Ability to work seamlessly as part of a multi-site, multicultural, development and testing team, onshore and offshore, internal and external resources.

Our Culture:

Our mission is to empower people, businesses and communities. A culture created less by what we do and more by who we are. When people ask what our team is about, we point to our core values: champion customer success, bring our best, achieve together, get stuff done, and innovate every day. We're always on the lookout for new, curious and capable people who can help us achieve our goal and we are seeking diversity in the people who join our team. We want to make sure that our company reflects the demographic of our customers, clients, and the communities in which we operate. So if you want to work for a friendly, global, inclusive and innovative company, we'd love to meet you!

What We Do:

Cornerstone is a premier people development company. We believe people can achieve anything when they have the right development and growth opportunities. We offer organizations the technology, content, expertise and specialized focus to help them realize the potential of their people. Featuring comprehensive recruiting, personalized learning, modern training content, development-driven performance management and holistic employee data management and insights, Cornerstone’s people development solutions are successfully used by more than 75 million people in 180+ countries and in nearly 50 languages.

Cornerstone takes special care to ensure the security and privacy of the data of its users.
Apply Here
For Remote Principal Data Engineer roles, visit Remote Principal Data Engineer Roles

********

Data Engineer at SonicJobs

Location: California City

Job Description:

Strong in SQL and Python

Data warehousing concepts

Good to have: big data technologies – Apache Spark, Kafka, Delta Lake

Apply Here
For Remote Data Engineer roles, visit Remote Data Engineer Roles

********

Principal Data Engineer – Remote at LVMH Fashion Group USA

Location: Los Angeles

Our technology team works fast and smart. With San Francisco as our home, we take bringing new tech to market seriously, developing the latest in mobile technologies, scalable architecture, and the coolest in-store client experience. We love what we do and we have fun doing it. The Technology group is comprised of motivated self-starters and true team players that are absolutely integral to the growth of Sephora and our future success.

Your role at Sephora:

As a Principal Software Engineer, you will design and implement innovative analytical solutions and work alongside the product engineering team, evaluating new features and architecture. Reporting to the Engineering Manager, Data Platform, you will work closely with other team members like data architects and business analysts to understand what the business is trying to achieve, move data from source to target, and design optimal data models. You will also be responsible for building and maintaining the data platform. This hands-on technical role demands excellent knowledge of, and the ability to demonstrate, industry best practices. Come be a part of a team that is starting this new journey.

Responsibilities
• Design and build enterprise analytical solutions using Databricks and the Azure technology stack.
• Solid Experience with programming languages such as Scala (or Java)
• Build and scale data infrastructure that powers batch and Real Time data processing
• Streamline the intake of the raw data into our Data lake.
• Develop ETL and implement best practices for ETL development
• Work effectively using scrum with multiple team members to deliver analytical solutions to the business functions.
• Perform production support and deployment activities

We’re excited about you if you have:
• 8 – 10 years of experience with large scale data warehouse projects
• BS in Computer Science or equivalent is required
• Solid Experience with programming languages such as Scala (or Java)
• Expert experience with any of the ETL tools (Informatica, Pentaho, Data Stage, etc )
• Prior experience working with Cloud technologies preferably Azure
• Preferred experience with data integration tools
• Prior experience working with Retail/CRM/Finance datasets
• Very comfortable in designing facts, dimensions, snapshots, SCDs, etc
• UNIX/Linux experience and Scripting skills (shell, Perl, Python, etc.)
• Write complex SQL for processing raw data, data validation and QA
• Experience working with APIs to collect or ingest data.
• Degree in computer science, engineering, MIS, or related field.
• Extensive knowledge and understanding of JavaScript.
• Experience with JavaScript libraries (e.g. ExtJS, Backbone.js, and AngularJS).
• Strong Database knowledge, COSMOS DB & MySQL preferred
• Communication Skills – Data Engineers are part of a team, working with database administrators, data analysts, and management, and need to be effective communicators.
• Attention to Detail – Databases are complex, and a minute error can cause huge problems.
• Problem-Solving Skills – Data Engineers look at an issue that needs to be solved and come up with solutions quickly.
• Growth – A learner's desire to continue learning about where the BI industry is going and cloud technology.
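The "complex SQL for processing raw data, data validation and QA" requirement above is concrete enough to sketch. The following is an illustrative data-quality query run through Python's built-in sqlite3 module; the table, columns, and thresholds are hypothetical, not from the posting:

```python
import sqlite3

# Illustrative raw table; names and values are made up for the sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INT, customer_id INT, amount REAL);
    INSERT INTO raw_orders VALUES
        (1, 10, 25.0), (2, 11, NULL), (2, 11, NULL), (3, NULL, 40.0);
""")

# One-pass validation query: total rows, NULL counts, and duplicate keys.
total_rows, null_amounts, null_customers, duplicate_keys = conn.execute("""
    SELECT COUNT(*)                                             AS total_rows,
           SUM(CASE WHEN amount IS NULL THEN 1 ELSE 0 END)      AS null_amounts,
           SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END) AS null_customers,
           COUNT(*) - COUNT(DISTINCT order_id)                  AS duplicate_keys
    FROM raw_orders
""").fetchone()
```

The same CASE/aggregate pattern scales to warehouse engines; in a real QA job the findings would gate the load rather than just be reported.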

You’ll love working here because:
• The people. You will be surrounded by some of the most talented, supportive, smart, and kind leaders and teams – people you can be proud to work with.
• The product. Employees enjoy a product discount and receive free product ( gratis ) various times throughout the year. (Think your friends and family love you now? Just wait until you work at Sephora!)
• The business. It feels good to win – and Sephora is a leader in the retail industry, defining experiential retail with a digital focus and creating the most loved beauty community in the world with the awards and accolades to back it up.
• The perks. Sephora offers comprehensive medical benefits, generous vacation/holiday time off, commuter benefits, and Summer Fridays (half-days every Friday between Memorial and Labor Day) and so much more.
• The LVMH family. Sephora’s parent company, LVMH, is one of the largest luxury groups in the world, providing support to over 70 brands such as Louis Vuitton, Celine, Marc Jacobs, and Dior.

Working at Sephora’s Field Support Center (FSC)

Our North American operations are based in the heart of San Francisco’s Financial District, but you won’t hear us call it a headquarters – it’s the Field Support Center (FSC). At the FSC, we support our stores in providing the best possible experience for every client. Dedicated teams cater to our client’s every need by creating covetable assortments, curated content, compelling storytelling, smart strategy, skillful analysis, expert training, and more. It takes a lot of curious and confident individuals, disrupting the status quo and taking chances. The pace is fast, the fun is furious, and the passion is real. We never rest on our laurels. Our motto? If it’s not broken, fix it.

Sephora is an equal opportunity employer and values diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, ancestry, citizenship, gender, gender identity, sexual orientation, age, marital status, military/veteran status, or disability status. Sephora is committed to working with and providing reasonable accommodation to applicants with physical and mental disabilities.

Sephora will consider for employment all qualified applicants with criminal histories in a manner consistent with applicable law.

COMPANY OVERVIEW:

SEPHORA has been changing the face of prestige cosmetics since its debut in 1970s Paris. Sephora was acquired by luxury group Moët Hennessy Louis Vuitton (LVMH) in 1997 then launched stateside in 1998, and is currently home to 200 world-class brands – including its own private label, SEPHORA COLLECTION. Sephora's curated assortment features more than 14,000 products including makeup, skin care, perfume, hair care, body, professional tools and more. Sephora is the beauty education hub, offering consultations at the Beauty Studio, a variety of complimentary classes, one-on-one service from Personal Beauty Advisors, and exclusive retail technology SKINCARE IQ, COLOR IQ, and FRAGRANCE IQ. Sephora is an international force in beauty, and its award-winning website and ever-growing presence on social media make it the world's premier digital beauty destination.
Apply Here
For Remote Principal Data Engineer – Remote roles, visit Remote Principal Data Engineer – Remote Roles

********

Senior Data Engineer at SPACEX

Location: Mountain View

SpaceX was founded under the belief that a future where humanity is out exploring the stars is fundamentally more exciting than one where we are not. Today SpaceX is actively developing the technologies to make this possible, with the ultimate goal of enabling human life on Mars.

SpaceX is looking for a Sr. Data Engineer to drive data analysis and monitoring for the Starlink network, with the goal of providing better Internet access to unconnected users worldwide. You will set best practices for how to use our data to direct developer efforts, find and solve network inefficiencies, create and drive KPIs for network quality, and solve the network's biggest problems. The tools you build will allow Starlink to expand its user base, improve its user experience, and serve unconnected populations across the globe.

RESPONSIBILITIES:
• Define and create Real Time and historical metrics, dashboards, and KPIs to monitor network performance, outages, and regressions
• Use data analytics to isolate performance bottlenecks in reliability, throughput and latency
• Onboard other teams at Starlink so they can create their own monitoring dashboards using a common toolset
• Bring machine learning into our toolkit: ML models to predict failures, anomaly detection
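The anomaly-detection responsibility above usually starts far simpler than an ML model. A minimal sketch, assuming z-score thresholding on latency samples (a toy first-pass detector, not SpaceX's actual tooling):

```python
from statistics import mean, stdev

def detect_anomalies(samples, threshold=3.0):
    """Flag samples more than `threshold` standard deviations from
    the mean – the classic first-pass outlier detector that more
    sophisticated anomaly-detection models are benchmarked against."""
    mu, sigma = mean(samples), stdev(samples)
    return [x for x in samples if abs(x - mu) > threshold * sigma]

# Simulated per-minute latency readings (ms); the spike should be flagged.
latencies = [20, 21, 19, 22, 20, 21, 500, 20, 19, 22]
outliers = detect_anomalies(latencies, threshold=2.0)
```

Note the caveat built into z-scores: a large outlier inflates the standard deviation itself, which is one reason production systems move to robust statistics (median/MAD) or learned models.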

BASIC QUALIFICATIONS:
• Bachelor’s degree in computer science, data science, physics, mathematics, or a STEM discipline
• 5+ years professional experience in analytics, data science, data engineering, or software engineering
• 5+ years professional development experience with SQL, Python, Spark, R, or other programming languages

PREFERRED SKILLS AND EXPERIENCE:
• 5+ years professional experience building predictive models and machine learning pipelines (clustering analysis, failure prediction, anomaly detection)
• 5+ years professional experience in custom ETL design, implementation and maintenance
• Professional experience with schema design and dimensional data modeling
• Professional experience working in a Linux environment, and open source tools
• Professional experience working with in-stream data processing of structured and semi-structured data
• Experience handling large (TB+) datasets
• Domain-specific experience a plus, but not required
• Demonstrated ability to own projects from start to completion
• Strong attention to detail

ITAR REQUIREMENTS:

To conform to U.S. Government space technology export regulations, including the International Traffic in Arms Regulations (ITAR) you must be a U.S. citizen, lawful permanent resident of the U.S., protected individual as defined by 8 U.S.C. 1324b(a)(3), or eligible to obtain the required authorizations from the U.S. Department of State. Learn more about the ITAR here.

SpaceX is an Equal Opportunity Employer; employment with SpaceX is governed on the basis of merit, competence and qualifications and will not be influenced in any manner by race, color, religion, gender, national origin/ethnicity, veteran status, disability status, age, sexual orientation, gender identity, marital status, mental or physical disability or any other legally protected status.

Applicants wishing to view a copy of SpaceX’s Affirmative Action Plan for veterans and individuals with disabilities, or applicants requiring reasonable accommodation to the application/interview process should notify the Human Resources Department.
Apply Here
For Remote Senior Data Engineer roles, visit Remote Senior Data Engineer Roles

********

Data Engineer (IMO) at Genentech

Location: San Francisco

The Position

The candidate will join the Information Management Office (IMO) team to build and deploy data solutions for clinical trial operations, data science/predictive capabilities, and analytics/business intelligence tools, and will serve as the IMO data engineering representative on ECD programs with minimal oversight.

Assist in communicating technical concepts to business stakeholders and communicate any gaps to the technical team. Understand how the work fits into the larger project, identify problems with requirements, and communicate them to IMO leadership. Participate in roadmap and strategy discussions. Provide input on project estimations, specifications, and any ongoing issues that may negatively impact project deliverables. Partner closely with project managers and with technology and business teams to evaluate their needs and provide engineering solutions. Work closely with infra/DevOps engineers, data engineers, data analysts, and data scientists on the team to deliver high-quality solutions. Lead solution architecture with minimal supervision. Provide L1 and L2 support for all data engineering tickets, maintaining the agreed-upon SLAs, RTOs, and uptime goals. Engage senior cloud engineers/leads and vendor support teams in the event of escalation.

Required Skills
• 3+ years of experience with data engineering in cloud data solutions (AWS preferred)
• 5+ years of experience building data platforms, data lakes, modern data warehouse architectures, and self-service business intelligence solutions
• Expertise in designing efficient data models, optimizing existing data marts, and developing and deploying data structures based on those data models
• Expertise in designing and implementing data security to ensure the compliance of all data assets and analytical applications
• 3+ years of experience with SQL and relational databases
• 5+ years of extensive experience with data processing and ETL/ELT techniques
• 2+ years of experience developing and supporting scalable data pipelines using technologies such as Kafka, Spark, and Airflow to support batch and streaming data efficiently
• 3+ years of Python programming experience.
• Experience with high performance distributed data computing.
• Experience with good software development, automation practices, including collaborative development using DevOps pipelines.
• Build processes supporting data transformation, data structures, metadata, dependency and workload management.
• Excellent communication, advanced English reading, writing, listening and speaking skills.
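The batch-vs-streaming distinction in the pipeline requirement above is worth making concrete. A minimal sketch in plain Python, under the assumption that Kafka/Spark would supply the transport and scale in a real system; all names here are illustrative:

```python
from collections import defaultdict

class RunningCounter:
    """Streaming-style aggregation: state is updated one record at
    a time as events arrive, never by rescanning the full history."""
    def __init__(self):
        self.counts = defaultdict(int)

    def ingest(self, event):
        self.counts[event["type"]] += 1

def batch_count(events):
    """Batch-style equivalent: recompute the aggregate from the
    complete dataset on each scheduled run."""
    counts = defaultdict(int)
    for e in events:
        counts[e["type"]] += 1
    return dict(counts)

events = [{"type": "click"}, {"type": "view"}, {"type": "click"}]
stream = RunningCounter()
for e in events:
    stream.ingest(e)
```

Both paths must converge on the same answer; keeping them reconcilable is exactly what "support batch and streaming data efficiently" asks for in practice.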

Desired Skills:
• 1 year+ Experience in Data Visualization tools such as Tableau, Power BI etc.
• Knowledge in Graph and NoSQL databases
• Previous experience with Informatica, Talend tools
• Exposure to Data Science Technologies and Capabilities

Genentech is an equal opportunity employer, and we embrace the increasingly diverse world around us. Genentech prohibits unlawful discrimination based on race, color, religion, gender, sexual orientation, gender identity or expression, national origin or ancestry, age, disability, marital status and veteran status.

Genentech requires all new hires to be fully vaccinated against COVID-19 as of their start date. This requirement is a condition of employment at Genentech, and it applies regardless of whether the position is located at a Genentech campus or is fully remote. If you are unable to receive the vaccine due to a disability or serious medical condition, or because it is prohibited as a result of your sincerely held religious beliefs, you will have an opportunity to request a reasonable accommodation.
Apply Here
For Remote Data Engineer (IMO) roles, visit Remote Data Engineer (IMO) Roles

********

Principal Data Engineer – Remote at LVMH Fashion Group USA

Location: San Diego

Our technology team works fast and smart. With San Francisco as our home, we take bringing new tech to market seriously, developing the latest in mobile technologies, scalable architecture, and the coolest in-store client experience. We love what we do and we have fun doing it. The Technology group is comprised of motivated self-starters and true team players that are absolutely integral to the growth of Sephora and our future success.

Your role at Sephora:

As a Principal Software Engineer, you will design and implement innovative analytical solutions and work alongside the product engineering team, evaluating new features and architecture. Reporting to the Engineering Manager, Data Platform, you will work closely with other team members like data architects and business analysts to understand what the business is trying to achieve, move data from source to target, and design optimal data models. You will also be responsible for building and maintaining the data platform. This hands-on technical role demands excellent knowledge of, and the ability to demonstrate, industry best practices. Come be a part of a team that is starting this new journey.

Responsibilities
• Design and build enterprise analytical solutions using Databricks and the Azure technology stack.
• Solid Experience with programming languages such as Scala (or Java)
• Build and scale data infrastructure that powers batch and Real Time data processing
• Streamline the intake of the raw data into our Data lake.
• Develop ETL and implement best practices for ETL development
• Work effectively using scrum with multiple team members to deliver analytical solutions to the business functions.
• Perform production support and deployment activities

We’re excited about you if you have:
• 8 – 10 years of experience with large scale data warehouse projects
• BS in Computer Science or equivalent is required
• Solid Experience with programming languages such as Scala (or Java)
• Expert experience with any of the ETL tools (Informatica, Pentaho, Data Stage, etc )
• Prior experience working with Cloud technologies preferably Azure
• Preferred experience with data integration tools
• Prior experience working with Retail/CRM/Finance datasets
• Very comfortable in designing facts, dimensions, snapshots, SCDs, etc
• UNIX/Linux experience and Scripting skills (shell, Perl, Python, etc.)
• Write complex SQL for processing raw data, data validation and QA
• Experience working with APIs to collect or ingest data.
• Degree in computer science, engineering, MIS, or related field.
• Extensive knowledge and understanding of JavaScript.
• Experience with JavaScript libraries (e.g. ExtJS, Backbone.js, and AngularJS).
• Strong Database knowledge, COSMOS DB & MySQL preferred
• Communication Skills – Data Engineers are part of a team, working with database administrators, data analysts, and management, and need to be effective communicators.
• Attention to Detail – Databases are complex, and a minute error can cause huge problems.
• Problem-Solving Skills – Data Engineers look at an issue that needs to be solved and come up with solutions quickly.
• Growth – A learner's desire to continue learning about where the BI industry is going and cloud technology.

You’ll love working here because:
• The people. You will be surrounded by some of the most talented, supportive, smart, and kind leaders and teams – people you can be proud to work with.
• The product. Employees enjoy a product discount and receive free product ( gratis ) various times throughout the year. (Think your friends and family love you now? Just wait until you work at Sephora!)
• The business. It feels good to win – and Sephora is a leader in the retail industry, defining experiential retail with a digital focus and creating the most loved beauty community in the world with the awards and accolades to back it up.
• The perks. Sephora offers comprehensive medical benefits, generous vacation/holiday time off, commuter benefits, and Summer Fridays (half-days every Friday between Memorial and Labor Day) and so much more.
• The LVMH family. Sephora’s parent company, LVMH, is one of the largest luxury groups in the world, providing support to over 70 brands such as Louis Vuitton, Celine, Marc Jacobs, and Dior.

Working at Sephora’s Field Support Center (FSC)

Our North American operations are based in the heart of San Francisco’s Financial District, but you won’t hear us call it a headquarters – it’s the Field Support Center (FSC). At the FSC, we support our stores in providing the best possible experience for every client. Dedicated teams cater to our client’s every need by creating covetable assortments, curated content, compelling storytelling, smart strategy, skillful analysis, expert training, and more. It takes a lot of curious and confident individuals, disrupting the status quo and taking chances. The pace is fast, the fun is furious, and the passion is real. We never rest on our laurels. Our motto? If it’s not broken, fix it.

Sephora is an equal opportunity employer and values diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, ancestry, citizenship, gender, gender identity, sexual orientation, age, marital status, military/veteran status, or disability status. Sephora is committed to working with and providing reasonable accommodation to applicants with physical and mental disabilities.

Sephora will consider for employment all qualified applicants with criminal histories in a manner consistent with applicable law.

COMPANY OVERVIEW:

SEPHORA has been changing the face of prestige cosmetics since its debut in 1970s Paris. Sephora was acquired by luxury group Moët Hennessy Louis Vuitton (LVMH) in 1997 then launched stateside in 1998, and is currently home to 200 world-class brands – including its own private label, SEPHORA COLLECTION. Sephora's curated assortment features more than 14,000 products including makeup, skin care, perfume, hair care, body, professional tools and more. Sephora is the beauty education hub, offering consultations at the Beauty Studio, a variety of complimentary classes, one-on-one service from Personal Beauty Advisors, and exclusive retail technology SKINCARE IQ, COLOR IQ, and FRAGRANCE IQ. Sephora is an international force in beauty, and its award-winning website and ever-growing presence on social media make it the world's premier digital beauty destination.
Apply Here
For Remote Principal Data Engineer – Remote roles, visit Remote Principal Data Engineer – Remote Roles

********

Big Data Engineer at Magnite

Location: Los Angeles

Join our team as a Big Data Engineer!
We are looking for an entry-level Big Data Engineer to join our experienced team of engineers who maintain and load data into our petabyte-sized data lake and data warehouse in Snowflake. We have these systems running in the cloud and on-prem in multiple geographies 24/7. The ideal candidate would be interested in expanding their skills in these areas while also taking on daily support needs in concert with the rest of our team.
We want someone who is unafraid of taking on new challenges and willing to learn new things. The work of team members on the data platforms team can vary widely. Ideal candidates will be willing to tackle anything in data engineering. In addition the big data engineer will need to have good communication skills to provide clear status on issues when they arise.
As A Data Platform Engineer You Will
• Write production programs in PySpark to support our data pipeline.
• Write production ready scripts in Python and Bash to automate the management of our systems.
• Respond to user requests, which include troubleshooting errors and performance issues, ad-hoc requests, etc.
• Work collaboratively with the data platform engineering team and external teams to continuously improve our operations.
• Support operations of our systems in production.
• Be a creative problem solver who can draw on experience to triage data platform issues in a timely manner without compromising reliability and performance of the entire system.

Our Ideal Candidate Will Have
• 1+ years of experience working with Python and Spark to process data at scale.
• Demonstrated strong communication skills and the ability to work in a highly collaborative environment.
• 2+ years working with Hive (Building and Managing Tables)
• Strong computer science fundamentals, such as algorithms and data-structures.
• Willingness to take responsibility for supporting our production systems including some on call.
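The production-support and triage responsibilities above usually boil down to making each pipeline step resilient to transient failures. A minimal sketch in plain Python (the step, retry counts, and "flaky load" are hypothetical; in a real stack the scheduler, e.g. Airflow, would provide this):

```python
import time

def run_with_retries(step, max_attempts=3, backoff_s=0.0, log=print):
    """Run one pipeline step, retrying transient failures with a
    growing backoff. If the final attempt still fails, the exception
    propagates so the scheduler can page on-call instead of silently
    swallowing the error."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception as exc:
            log(f"attempt {attempt}/{max_attempts} failed: {exc}")
            if attempt == max_attempts:
                raise
            time.sleep(backoff_s * attempt)

# A simulated step that fails twice with a transient error, then succeeds.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network blip")
    return "loaded 42 rows"

result = run_with_retries(flaky_load, max_attempts=3, log=lambda m: None)
```

The design choice worth noting: retries handle *transient* faults only; a deterministic bug will exhaust the attempts and surface loudly, which is what you want from a production system.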

Nice To Have
• Any related experience working in Big Data Engineering (HDFS, S3, Snowflake, Airflow, etc.)

Additional Details
• We are an Equal Opportunity Employer and do not discriminate against applicants due to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, gender identity, sexual orientation or any other federal, state or local protected class.
• In Colorado, the base salary for this role is between $110,000 and $140,000 and also includes an annual bonus and company equity (NASDAQ: MGNI).
• Perks/Benefits: Equity and Employee Stock Purchase Plan, Pension and Retirement Savings Plan in Several Countries, Comprehensive Healthcare Benefits for You and Your Family, Generous Time Off, Holiday Breaks and Summer Fridays, Family-Focused Leave Benefits, and Cell Phone Subsidy
• Invest in You: Performance Management, and Investment in Diversity Initiatives, Bonusly Peer-to-Peer Recognition Program, Turning Recognition into Tangible Perks and Magnite Swag, Community Service Events, Wellness Coach—Meditate and Recharge with an Unlimited User Account for You and a Plus One
• COVID-19 Precautions: remote interview process, virtual onboarding for new hires
Apply Here
For Remote Big Data Engineer roles, visit Remote Big Data Engineer Roles

********

Senior Data Engineer- Remote at JUUL Labs

Location: San Jose

Juul Labs’ mission is to impact the lives of the world’s one billion adult smokers by eliminating combustible cigarettes. We have the opportunity to address one of the world’s most intractable challenges through a commitment to exceptional quality, research, design, and innovation. Backed by leading technology investors, we are committed to the same excellence when it comes to hiring great talent.

We are a diverse team that is united by this common purpose and we are hiring the world’s best engineers, scientists, designers, product managers, operations experts, and customer service and business professionals. If the opportunity to build your career at one of the fastest growing companies is compelling, read on for more details.

ROLE AND RESPONSIBILITIES:

Leverage Python to design robust, reusable and scalable data solutions and data pipeline frameworks to automate the ingestion, processing and delivery of both structured and unstructured data.

Drive development of large-scale data engineering projects.

Create data pipelines in Airflow, DBT, and the Google Cloud Platform suite.

Build, manage, and support data models. Ensure data quality with data tests in Monte Carlo and Datafold.

Comfortable working in a scrum agile environment using Jira.

Partner with Data Scientists, Data Engineers and Business Analysts to build configurable, scalable, and robust data processing infrastructure

Work closely with our sales, operations, research, and finance teams on data storage, retrieval, and analysis

Develop new systems and tools to enable stakeholders to consume and understand data more intuitively

Create and establish design standards and assurance processes to ensure compatibility and operability of data connections, flows and storage requirements

Validate model transformations for data integrity (source/target table values and counts match expectations, and proper data cleansing is enforced)

Keep Juul on the cutting edge of data technology
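The transformation-validation responsibility above can be sketched in plain Python; the table contents, key name, and checks here are hypothetical, not a real JUUL schema:

```python
# Hypothetical sketch: validate a model transformation by comparing
# source and target row counts and checking for null keys after cleansing.
def validate_transformation(source_rows, target_rows, key="user_id"):
    """Return a list of data-integrity failures (empty list = pass)."""
    failures = []
    # Row counts should match after a 1:1 transformation.
    if len(source_rows) != len(target_rows):
        failures.append(
            f"row count mismatch: source={len(source_rows)} target={len(target_rows)}"
        )
    # Proper cleansing means no null keys survive in the target.
    null_keys = sum(1 for row in target_rows if row.get(key) is None)
    if null_keys:
        failures.append(f"{null_keys} null '{key}' values in target")
    return failures

source = [{"user_id": 1}, {"user_id": 2}]
target = [{"user_id": 1}, {"user_id": None}]
print(validate_transformation(source, target))
```

In practice these checks would run as data tests in DBT, Monte Carlo, or Datafold rather than hand-rolled Python, but the assertions are the same shape.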

Our Data Stack:

Airflow, Fivetran

Google Cloud Platform -GCP (BigQuery, Storage, Dataflow, Pub/Sub, Cloud Functions/Run, Vertex AI, Cloud Build)

DBT

Monte Carlo, Datafold

Tableau

PERSONAL AND PROFESSIONAL QUALIFICATIONS:

4+ years of data engineering or software engineering experience with a focus on data

Proven experience developing Python code to process large-scale datasets and workflows

Experience using Python libraries and packages (pandas, pyarrow) in conjunction with the Google Cloud Platform (BigQuery, Storage, Pub/Sub)

Knowledge of bash/shell and orchestration tools (e.g., Airflow) is preferred

Experience with version control (Git) and containers (Docker)

Skilled in analytical SQL in support of data modeling and manipulating multiple data formats
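The "analytical SQL" qualification above typically means aggregation plus window functions over modeled data. A minimal sketch using an in-memory SQLite database (the schema and figures are invented for illustration, and the same query shape would run on BigQuery):

```python
import sqlite3

# Illustrative sketch of analytical SQL: rank regional order totals with
# a window function over an aggregate. Data is invented for illustration.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (day TEXT, region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("2022-08-01", "west", 120.0),
     ("2022-08-01", "east", 80.0),
     ("2022-08-02", "west", 200.0),
     ("2022-08-02", "east", 90.0)],
)
rows = con.execute("""
    SELECT region,
           SUM(amount) AS total,
           RANK() OVER (ORDER BY SUM(amount) DESC) AS rnk
    FROM orders
    GROUP BY region
    ORDER BY rnk
""").fetchall()
print(rows)
```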

EDUCATION:

Master's degree in Computer Science, Engineering, Math, or equivalent experience preferred

JUUL LABS PERKS & BENEFITS:

A place to grow your career. We’ll help you set big goals – and exceed them

People. Work with talented, committed and supportive teammates

Equity and performance bonuses. Every employee is a stakeholder in our success

Boundless snacks and drinks

Cell phone subsidy, commuter benefits and discounts on JUUL products

Excellent medical, dental and vision benefits
Apply Here
For Remote Senior Data Engineer- Remote roles, visit Remote Senior Data Engineer- Remote Roles

********

Senior Data Engineer at Tesla Motors

Location: Palo Alto

What to Expect

The Cell Manufacturing team is responsible for performing advanced analytics, creating infrastructure and data pipelines, developing predictive models and making data applications that enable cross functional teams to leverage a wealth of manufacturing, equipment and vehicle data both efficiently and effectively. In this role, you will focus on building data pipelines and infrastructure that power data science systems, tools, software and applications for the Cell Engineering team.

What You’ll Do

Responsibilities:

Analyze and interpret high volume manufacturing data from various sources and assembly operations to extract useful statistics and insights about the operation

Support engineering staff and management in investigations of failures, containment activities, continuous improvement initiatives, etc., to drive meaningful improvements to production quality and output

Work effectively with engineers and conduct end-to-end analyses, from data requirement gathering, to data processing and modeling

Interpret data, analyze results using statistical techniques and provide ongoing reports

Monitor key product metrics, understanding root causes of changes in metrics

Identify, analyze, and interpret trends or patterns in complex data sets and depict the story via dashboards and reports

Create and maintain standardized reporting tools for the operation, providing results on the health of the business for various audiences, including senior management

Maintain existing data visualizations, data pipelines and dashboard enhancement requests

Acquire data from primary or secondary data sources and maintain databases/data systems to empower operational and exploratory analysis

Automate analyses and authoring pipelines via SQL, Python, Tableau, and similar

Drive underlying data systems improvement by working with key cross-functional stakeholders
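The first responsibility above, extracting useful statistics from high-volume manufacturing data, can be illustrated with a small Python sketch; the station names and record fields are hypothetical, not Tesla's actual data model:

```python
# Hypothetical sketch: summarize first-pass yield per station from
# manufacturing test records. Field names are invented for illustration.
records = [
    {"station": "weld", "passed": True},
    {"station": "weld", "passed": False},
    {"station": "weld", "passed": True},
    {"station": "fill", "passed": True},
    {"station": "fill", "passed": True},
]

def yield_by_station(rows):
    """Return {station: pass_rate} over the given test records."""
    stations = {}
    for row in rows:
        stations.setdefault(row["station"], []).append(row["passed"])
    # True counts as 1 when summed, so the mean is the pass rate.
    return {s: sum(results) / len(results) for s, results in stations.items()}

print(yield_by_station(records))
```

A real pipeline would pull these records from a Manufacturing Execution System via SQL and surface the rates in a Tableau dashboard; the aggregation logic is the same.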

What You’ll Bring

B.S. degree or higher in a quantitative discipline (e.g., Computer Science, Mathematics, Physics, Electrical Engineering, Statistics, Industrial Engineering) or the equivalent in experience and evidence of exceptional ability

5+ years of work experience in data engineering and platform engineering

Extensive experience developing software

Works well under pressure while collaborating and managing competing demands with tight deadlines

Experience in high-volume manufacturing, where automated assembly is performed and data is collected via Manufacturing Execution Systems, is a plus

Understanding of software and database design to work closely with a development team to translate business needs into software solutions

Preferred Qualifications:

Data mining and database (e.g., MySQL) experience

Data visualization experience (e.g., Tableau)

Tesla is an Equal Opportunity/Affirmative Action employer committed to diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, age, national origin, disability, protected veteran status, gender identity or any other factor protected by applicable federal, state or local laws.
Apply Here
For Remote Senior Data Engineer roles, visit Remote Senior Data Engineer Roles

********

Principal Engineer, Data Engineering at Ripple

Location: San Francisco

Ripple is the world’s only enterprise blockchain solution for global payments. Today the world sends more than $155 trillion across borders. Yet, the underlying infrastructure is dated and flawed. Ripple connects banks, payment providers, corporates and digital asset exchanges via RippleNet to provide one frictionless experience to send money globally.

In this role, you will architect and implement the data infrastructure for analytics and data-centric product features at Ripple. This includes building our complete data platform: unified data ingestion, distributed processing systems, self-serve data lakes, and batch/stream ETL pipelines producing golden datasets for analytics. Successful candidates will be able to demonstrate a history of thoughtfulness and curiosity in data ingestion, generation, and pipelining, as well as in the governance and security of data at Ripple.

This is a senior, high-visibility role that requires a clear architectural vision, the desire to code rapidly and ship at a fast pace, exacting communication and leadership skills, and the ability to educate our firm on the technology we are developing and shipping. You will also represent Ripple's Data Platform and Engineering as an expert member, responding to conversations, seeding ideas, and participating in architectural discussions.

About The Role

This position is responsible for architecting, designing, implementing, and managing data platforms for Ripple on various hyper scale platforms (AWS, GCP).

You will partner with other engineers and product managers to translate data needs into critical information that can be used to implement scalable data platforms and self-service tools.

Collaborate with business teams by providing technical input to Data Governance policies, standards, and processes, including clear data classification, data ownership, and access and security (privacy and protection) controls for sensitive data.

Work with service teams and other engineering and business partners on Data Infrastructure and Engineering roadmap planning to build the database infrastructure for future scale and volume.

Keep observability as a focus for all database monitoring, and improve and implement auto-remediation techniques.

Partner with service and performance teams for continuous architecture improvements, resiliency, and performance.

Own the delivery, quality, and reliability of our Financial Data Hub

Develop data migration architecture for scale and strategy for data migration across clouds.

The ideal candidate will have strong hands-on experience in designing, developing, and managing enterprise level database systems with complex interdependencies and key focus on high-availability, clustering, cloud migration, security, performance, and scalability.

Key Responsibilities

12+ years' experience in designing and developing enterprise data architecture and engineering solutions supporting massive workloads and data scale/volume.

Experience working with private and public clouds (AWS, GCP) and capacity management principles.

Design, and implement a scalable data lake, including data integration and curation

Build a modular set of data services using Python/Scala, BigQuery/Presto SQL, API Gateway, Kafka, and Apache Spark on EMR/Dataproc, among others

Deep knowledge in Data Warehouse architecture and integration

Research, design, and experiment to execute fast proof of concepts to evaluate similar products.

Participate in the strategic development of methods, techniques, and evaluation criteria for projects and programs. This will include assessment of build vs buy decisions at every stage, backed by proof of concepts, benchmarking, etc.

Experience working autonomously and taking ownership of projects.

Create data applications supporting search, real-time data alerts, and APIs for pulling large volumes of data.

Design and implement innovative data services solutions using Microservices and other UI and API related technologies

Implement processes and systems to manage data quality, ensuring production data is always accurate and available for key partners and business processes that depend on it.

Write unit/integration tests, contribute to the engineering wiki, and document work.

Work closely with a team of Front End and Back End engineers, product managers, and analysts.

Coaching other engineers on best practices for designing and operating reliable systems at scale

Design data integrations and data quality framework.

Execute the migration of data and processes from Legacy systems to new solutions.

Perform production support and deployment activities

Manage the system performance by performing regular tests, solving problems and integrating new features.

Offer support by responding to system problems in a timely manner.
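The data-quality responsibility above, keeping production data accurate and available for the partners that depend on it, often starts with a freshness check against an SLA. A minimal sketch, where the dataset names, timestamps, and 24-hour SLA are all hypothetical:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of a data-quality freshness check: flag datasets
# whose latest partition is older than an SLA. Names are illustrative.
def stale_datasets(latest_partitions, now, sla=timedelta(hours=24)):
    """Return names of datasets whose newest partition breaches the SLA."""
    return sorted(
        name for name, latest in latest_partitions.items()
        if now - latest > sla
    )

now = datetime(2022, 9, 1, 12, 0)
partitions = {
    "payments_golden": datetime(2022, 9, 1, 6, 0),   # 6h old: fresh
    "fx_rates": datetime(2022, 8, 30, 12, 0),        # 48h old: stale
}
print(stale_datasets(partitions, now))
```

In production this kind of check would feed alerting rather than a print statement, so problems are surfaced before downstream business processes consume stale data.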

WHAT WE OFFER:

The chance to work in a fast-paced start-up environment with experienced industry leaders

A learning environment where you can dive deep into the latest technologies and make an impact

Competitive salary and equity

100% paid medical and dental and 95% paid vision insurance for employees starting on your first day

401k (with match), commuter benefits

Industry-leading parental leave policies

Generous wellness reimbursement and weekly onsite programs

Flexible vacation policy – work with your manager to take time off when you need it

Employee giving match

Modern office in San Francisco’s Financial District

Fully-stocked kitchen with organic snacks, beverages, and coffee drinks

Weekly company meeting – ask me anything style discussion with our Leadership Team

Team outings to sports games, happy hours, game nights and more!
Apply Here
For Remote Principal Engineer, Data Engineering roles, visit Remote Principal Engineer, Data Engineering Roles

********

The Tech Career Guru