Full-time Data Engineer openings in Boston, September 10, 2022

Senior Data Engineer at Fidelity Investments

Location: Boston

Equity Trading Technology is seeking a Senior Software Engineer with domain expertise in Data Engineering to support our work on an extensive range of Equity Trading products. We are looking for top engineering talent as we design, build and advance our software solutions. The role involves working on an agile team in a highly collaborative, engaging environment and requires adaptability to shifting priorities.

The Expertise and Skills You Bring
• Bachelor's degree (or above) in Engineering, Computer Science, or a related field
• 4+ years’ experience as a software engineer, delivering software using agile development practices
• You can write highly performant, well-written PL/SQL with proficiency, quality and passion
• Proven experience and proficiency in performing data analysis
• Exposure to Snowflake, Python and AWS preferred
• Proven exposure to Continuous Integration & Continuous Delivery (CI/CD) practices specifically in the data engineering space.
• Proven exposure to test automation frameworks that support all types of testing (unit, component, integration, system etc.) carried out through CI/CD pipelines.
• You can use a test-driven approach to developing software and can build automated testing frameworks that detect anomalies in functionality, performance and integration.
• You are able to clearly document design options and decisions, and present these in an organized manner to technical as well as non-technical audiences
• You are a strong contributor with a collaborative work style in building, designing and reviewing code
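The test-driven bullet above can be sketched in Python: a small transformation function paired with automated checks that a CI/CD pipeline could run on every commit. All names here are illustrative and not taken from any actual codebase.

```python
# Minimal sketch of test-driven data engineering: a transformation
# plus automated checks a CI/CD pipeline could run on every commit.
# All names are hypothetical, for illustration only.

def normalize_trades(rows):
    """Uppercase tickers and drop rows with a non-positive quantity."""
    return [
        {**row, "ticker": row["ticker"].upper()}
        for row in rows
        if row["qty"] > 0
    ]

# Unit-level check: functional anomalies are caught before the
# change ever reaches an integration or system environment.
def test_normalize_trades():
    raw = [
        {"ticker": "aapl", "qty": 100},
        {"ticker": "msft", "qty": 0},  # invalid quantity: dropped
    ]
    assert normalize_trades(raw) == [{"ticker": "AAPL", "qty": 100}]

test_normalize_trades()
```

In practice the same test would live in a framework such as pytest and run automatically at each pipeline stage, which is what the CI/CD bullets above describe.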
Apply Here
For Remote Senior Data Engineer roles, visit Remote Senior Data Engineer Roles

********

Principal Data Engineer – Remote at LVMH Fashion Group USA

Location: Boston

Our technology team works fast and smart. With San Francisco as our home, we take bringing new tech to market seriously, developing the latest in mobile technologies, scalable architecture, and the coolest in-store client experience. We love what we do and we have fun doing it. The Technology group is comprised of motivated self-starters and true team players that are absolutely integral to the growth of Sephora and our future success.

Your role at Sephora:

As a Principal Software Engineer you will design and implement innovative analytical solutions and work alongside the product engineering team, evaluating new features and architecture. Reporting to the Engineering Manager, Data Platform, you will work closely with other team members like data architects and business analysts to understand what the business is trying to achieve, move data from source to target, and design optimal data models. You will also be responsible for building and maintaining the data platform. This hands-on technical role demands excellent knowledge of, and the ability to demonstrate, industry best practices. Come be a part of a team that is starting this new journey.

Responsibilities
• Design and build enterprise analytical solutions using Databricks and the Azure stack of technologies.
• Build and scale data infrastructure that powers batch and real-time data processing
• Streamline the intake of the raw data into our Data lake.
• Develop ETL and implement best practices for ETL development
• Work effectively using scrum with multiple team members to deliver analytical solutions to the business functions.
• Perform production support and deployment activities

We’re excited about you if you have:
• 8–10 years of experience with large-scale data warehouse projects
• BS in Computer Science or equivalent is required
• Solid experience with programming languages such as Scala (or Java)
• Expert experience with any of the ETL tools (Informatica, Pentaho, DataStage, etc.)
• Prior experience working with cloud technologies, preferably Azure
• Experience with data integration tools preferred
• Prior experience working with Retail/CRM/Finance datasets
• Very comfortable in designing facts, dimensions, snapshots, SCDs, etc.
• UNIX/Linux experience and Scripting skills (shell, Perl, Python, etc.)
• Write complex SQL for processing raw data, data validation and QA
• Experience working with APIs to collect or ingest data.
• Degree in computer science, engineering, MIS, or related field.
• Extensive knowledge and understanding of JavaScript.
• Experience with JavaScript libraries (e.g., ExtJS, Backbone.js, and AngularJS).
• Strong database knowledge; Cosmos DB and MySQL preferred
• Communication skills – Data Engineers are part of a team, working with database administrators, data analysts and management, and need to be effective communicators.
• Attention to detail – Databases are complex, and a minute error can cause huge problems.
• Problem-solving skills – Data Engineers look at an issue that needs to be solved and come up with solutions quickly.
• Growth mindset – a desire to keep learning about where the BI industry and cloud technology are heading.
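The dimensional-modeling bullet above (facts, dimensions, snapshots, SCDs) can be illustrated with a minimal Type 2 slowly changing dimension sketched in plain Python; the schema and column names are hypothetical.

```python
# Minimal Type 2 SCD sketch: instead of overwriting a changed
# attribute, close the current row and append a new version, so
# history is preserved. Schema and names are hypothetical.

def apply_scd2(dimension, key, new_attrs, as_of):
    """Expire the active row for `key` and append a new version."""
    for row in dimension:
        if row["key"] == key and row["end_date"] is None:
            row["end_date"] = as_of  # close out the old version
    dimension.append(
        {"key": key, **new_attrs, "start_date": as_of, "end_date": None}
    )
    return dimension

dim = [{"key": 1, "city": "Boston",
        "start_date": "2020-01-01", "end_date": None}]
apply_scd2(dim, 1, {"city": "Tulsa"}, "2022-09-10")
# dim now holds two rows for key 1: one closed, one active.
```

In a real warehouse this logic would typically be expressed as a MERGE in the ETL tool or SQL engine, but the before/after row shape is the same.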

You’ll love working here because:
• The people. You will be surrounded by some of the most talented, supportive, smart, and kind leaders and teams – people you can be proud to work with.
• The product. Employees enjoy a product discount and receive free product (gratis) various times throughout the year. (Think your friends and family love you now? Just wait until you work at Sephora!)
• The business. It feels good to win – and Sephora is a leader in the retail industry, defining experiential retail with a digital focus and creating the most loved beauty community in the world with the awards and accolades to back it up.
• The perks. Sephora offers comprehensive medical benefits, generous vacation/holiday time off, commuter benefits, and Summer Fridays (half-days every Friday between Memorial and Labor Day) and so much more.
• The LVMH family. Sephora’s parent company, LVMH, is one of the largest luxury groups in the world, providing support to over 70 brands such as Louis Vuitton, Celine, Marc Jacobs, and Dior.

Working at Sephora’s Field Support Center (FSC)

Our North American operations are based in the heart of San Francisco’s Financial District, but you won’t hear us call it a headquarters – it’s the Field Support Center (FSC). At the FSC, we support our stores in providing the best possible experience for every client. Dedicated teams cater to our client’s every need by creating covetable assortments, curated content, compelling storytelling, smart strategy, skillful analysis, expert training, and more. It takes a lot of curious and confident individuals, disrupting the status quo and taking chances. The pace is fast, the fun is furious, and the passion is real. We never rest on our laurels. Our motto? If it’s not broken, fix it.

Sephora is an equal opportunity employer and values diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, ancestry, citizenship, gender, gender identity, sexual orientation, age, marital status, military/veteran status, or disability status. Sephora is committed to working with and providing reasonable accommodation to applicants with physical and mental disabilities.

Sephora will consider for employment all qualified applicants with criminal histories in a manner consistent with applicable law.

COMPANY OVERVIEW:

SEPHORA has been changing the face of prestige cosmetics since its debut in 1970s Paris. Sephora was acquired by luxury group Moët Hennessy Louis Vuitton (LVMH) in 1997, launched stateside in 1998, and is currently home to 200 world-class brands – including its own private label, SEPHORA COLLECTION. Sephora's curated assortment features more than 14,000 products including makeup, skin care, perfume, hair care, body, professional tools and more. Sephora is the beauty education hub, offering consultations at the Beauty Studio, a variety of complimentary classes, one-on-one service from Personal Beauty Advisors, and exclusive retail technologies SKINCARE IQ, COLOR IQ, and FRAGRANCE IQ. Sephora is an international force in beauty, and its award-winning website and ever-growing presence on social media make it the world's premier digital beauty destination.
Apply Here
For Remote Principal Data Engineer roles, visit Remote Principal Data Engineer Roles

********

Senior Data Engineer at Wood Mackenzie

Location: Boston

Company Description

Wood Mackenzie is the global leader in data, analysis and consulting across the energy, chemicals, metals, mining, power and renewables sectors.

Founded in 1973, our success has always been underpinned by the simple principle of providing trusted research and advice that makes a difference to our customers. Today we have over 2,000 customers ranging from the largest global energy companies and financial institutions to governments as well as smaller market specialists.

Our teams are located around the world. This enables us to stay closely connected with customers and the markets and sectors we cover. Collectively this allows us to offer a compelling combination of global commodity analysis with detailed local market knowledge.

We are committed to supporting our people to grow and thrive. We value different perspectives and aspire to create an inclusive environment that encourages diversity and fosters a sense of belonging. We are committed to creating a workplace that works for you and encourage everyone to get involved in our Wellness, Diversity and Inclusion, and Community Engagement initiatives. We actively support flexible working and are happy to consider alternative work patterns, taking into account your needs and the needs of the team or division that you are looking to join.

Hear what our team has to say about working with us:

https://www.woodmac.com/careers/our-people/

We are proud to be a part of the Verisk family of companies!

At the heart of what we do is help clients manage risk. Verisk (Nasdaq: VRSK) provides data and insights to our customers in insurance, energy and the financial services markets so they can make faster and more informed decisions.

Our global team uses AI, machine learning, automation, and other emerging technologies to collect and analyze billions of records. We provide advanced decision-support to prevent credit, lending, and cyber risks. In addition, we monitor and advise companies on complex global matters such as climate change, catastrophes, and geopolitical issues.

But why we do our work is what sets us apart. It stems from a commitment to making the world better, safer and stronger.

It’s the reason Verisk is part of the UN Global Compact sustainability initiative. It’s why we made a commitment to balancing 100 percent of our carbon emissions. It’s the aim of our “returnship” program for experienced professionals rejoining the workforce after time away. And, it’s what drives our annual Innovation Day, where we identify our next first-to-market innovations to solve our customers’ problems.

At its core, Verisk uses data to minimize risk and maximize value. But far bigger, is why we do what we do.

At Verisk you can build an exciting career with meaningful work; create positive and lasting impact on business; and find the support, coaching, and training you need to advance your career. We have received the Great Place to Work® Certification for the fourth consecutive year. We’ve been recognized by Forbes as a World’s Best Employer and a Best Employer for Women, testaments to our culture of engagement and the value we place on an inclusive and diverse workforce. Verisk’s Statement on Racial Equity and Diversity supports our commitment to these values and affecting positive and lasting change in the communities where we live and work.

Job Description

Data Engineers play a key role in our Data organisation, contributing to the creation of our enterprise data assets, processes and integration with our Data Platform. They develop pipelines and processes used to manage data in the Cloud using traditional ETL and RDBMS tools and/or contemporary tools like MapR, Spark and Lambda. Architecturally they contribute to the conceptualization and design of data flows, platform interfaces, data models and complex modelling solutions.
• Development – Lead the development of pipelines and processes used to manage data on-premise and on the Cloud using traditional ETL and RDBMS tools and/or contemporary tools.
• Data Modeling – contribute to new processes, programs, and procedures to help model structured and unstructured data
• Architecture – contribute to conceptualization, design, and maintenance
• Collaboration with cross-functional teams to provide necessary data and best practices to model applications
• Mentoring – Develop experts in analytic specializations within the Verisk Analytic Community
• Analytics Community – Enable development of the Verisk Analytics Community by advancing Data Engineering as a discipline
• Software Engineering – Lead the development of complex modelling solutions by writing reusable, testable, and efficient code. Be passionate about secure, reliable and fast software using Rust.
• Industry Research – Share and adopt technical innovations and new developments in relevant analytic fields

Qualifications

You will be a self-starter, energised by a challenge, passionate about bringing great products to market, and love the thrill of creating a new standard for what’s possible. You are a proven leader, able to organise and motivate a team to deliver tangible business benefits. You can adapt to new ways of working – and enjoy collaborating with a wide range of stakeholders. You will ideally have experience in working in the energy sector – preferably in power and renewables generation, power trading, or consulting. You will help design and implement new modelling / simulation algorithms in Rust or Python, and port legacy models into Rust. You will work on projects through their lifecycle, from initial requirements through implementation and into production in our dedicated Modelling environment.

Knowledge & experience
• Excellent programming skills, particularly in one or more of Python or Rust
• Demonstrated expertise in at least one analytic specialization
• Ability to solve problems analytically and creatively
• Proactive, self-driven mindset
• Effective communication (written and oral) skills
• Experience with SQL or NoSQL databases
• Knowledge of Linear Programming or a background in Operations Research would be desirable

Additional Information

All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran’s status, age or disability.

http://www.verisk.com/careers.html

Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume.

Consumer Privacy Notice

At Verisk, the health and safety of our people is our number one priority. Effective November 15, 2021, and subject to applicable law, all prospective hires for office based roles or roles that support any of our businesses’ government contracts will be required to demonstrate that they are fully vaccinated against COVID-19 by their start date, or qualify for a legally-required medical or religious accommodation to this vaccination requirement, as a condition of employment. Hired candidates who do not demonstrate that they are fully vaccinated against COVID-19 by their start date, and who have not been approved for a legally-required medical or religious accommodation will no longer meet the requirements for employment and their offers of employment will be immediately rescinded, in accordance with applicable law.
Apply Here
For Remote Senior Data Engineer roles, visit Remote Senior Data Engineer Roles

********

Data Engineer / Senior Data Engineer at OpenSignal

Location: Boston

Opensignal is looking for a Data Engineer to join our Engineering team. If you have a passion for solving difficult problems, a desire to continue learning and strong programming fundamentals, then we want to speak with you.

As a Data Engineer, you’ll join a team focused on building data pipelines to support new and existing products as well as optimizing existing processes and integrating new data sets. The candidate will be involved in all aspects of the software development life cycle, including gathering business requirements, analysis, design, development and production support.

The successful candidate will be responsible for implementing and supporting highly efficient and scalable MSSQL and Python processes. The developer should be able to work collaboratively with other team members, as well as users for operational support. The candidate must be focused, hard-working and self-motivated, and enjoy working on complex problems.
• Department: Engineering
• Reporting to: Director of Engineering
• Location: Remote – US / Boston, MA

What you will be doing
• Support, maintain and evolve existing data pipelines utilizing MSSQL, SSIS, and Python.
• Implement business rule changes and enhancements in existing data pipelines.
• Automate existing processes.
• Document data mappings, data dictionaries, processes, programs and solutions as per established standards for data governance.
• Troubleshoot data issues and defects to determine root cause.
• Perform job monitoring, root cause analysis and resolution, and support production processes.
• Perform tuning of SQL queries, and recommend and implement query tuning techniques.
• Assist in data migration needs of the department and company where applicable.
• Develop ETL technical specifications, design, code, test, implement, and support optimal data solutions.
• Create new pipelines in SQL / Python supporting new product development.
• Design and develop SQL Server stored procedures, functions, views, transformation queries and triggers.

Required Skills & Abilities
• Minimum 5-6 years of SQL development experience, including design, development, testing, implementation and maintenance of SQL Server processes on both AWS and on-premises servers.
• Basic familiarity with Python.
• Solid experience developing complex SQL statements, T-SQL wrappers & procedures, SSIS, functions, views, triggers, etc.
• Excellent query-writing skills.
• Strong knowledge of ETL and Data Warehouse development best practices.
• Experience working with high-volume databases.
• Experience in data mapping, data migration and data analysis.
• Ability to work independently with minimal supervision.
• Strong analytical and problem-solving skills.
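Query tuning, one of the responsibilities listed above, usually begins with reading the query plan. Below is a minimal, vendor-neutral sketch using Python's built-in sqlite3 module; the posting's stack is SQL Server, where execution plans play the same role, so this only illustrates the general technique. Table and column names are made up.

```python
import sqlite3

# Minimal query-tuning sketch: inspect the plan before and after
# adding an index. sqlite3 is used only because it ships with
# Python; names are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE readings (device_id INTEGER, rssi REAL)")
con.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [(i % 100, -50.0 - i % 40) for i in range(1000)],
)

query = "SELECT avg(rssi) FROM readings WHERE device_id = 7"

# Without an index, the plan is a full table scan.
plan = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan[0][3])  # e.g. a SCAN over readings

# After adding an index on the filter column, the plan switches
# to an index search, avoiding the scan.
con.execute("CREATE INDEX idx_device ON readings (device_id)")
plan = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan[0][3])  # e.g. SEARCH ... USING INDEX idx_device
```

The same discipline (measure the plan, change one thing, measure again) carries over directly to T-SQL tuning on SQL Server.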

At this time, Opensignal will not provide employment sponsorship for this position.

This employer participates in E-Verify and will provide the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S.

About Us

Opensignal is the leading global provider of independent insight and data into network experience and market performance. Our user-centric approach allows communication providers to constantly improve their network and maximise commercial performance. Leading analysts, investors and financial institutions place a high value on our independent analysis and we are regular contributors to their reports.

Real network experience is our focus and ultimately that’s what influences customer choice. Our mission is to advance connectivity for all and here at Opensignal, the team is leading the industry in enabling operators to link their network experience and market performance in a way that has never before been possible.

With offices in London, Boston and Victoria, British Columbia, we are truly global, with employees working across four continents and representing over 25 nationalities. We are an equal opportunity employer dedicated to building an inclusive and diverse workforce.

Benefits

We believe we are stronger when we not only celebrate our many differences, values, and voices but include them in everyday practice. Having a diverse and inclusive culture is essential, which is why we offer a flexible approach to work-life balance, operating in a remote-hybrid way. We’ll help you get set up with the essentials you need to work from home or the office. We also offer an attractive range of additional benefits, including:
• Competitive compensation packages and global company ownership benefits.
• Comprehensive group benefits package and company sponsored retirement savings plan (details depend on your country of work).
• Professional development opportunities: education reimbursement, facilitator-led training, workshops, knowledge bites (internal learning talks) and more!
• Generous holiday allowance, sick leave, parental leave, flexible working culture and the opportunity to work from abroad.
• Charity matching, paid time off for community volunteering, mentorship, and DE&I program/committees.
• Regular virtual and in-person events and socials.
• We’ll support you to set up an effective home office environment.

• Stock options
• Professional development opportunities: education reimbursement, facilitator-led training, workshops, knowledge bites (internal learning talks) and more!
• Purpose led work: we drive conversations around, quality of, and access to, mobile network service, working towards a future where everyone benefits.

Interview Process
• Initial Qualification – Recruitment Team – To understand if you are suitable for the role
• 1st Stage Interview – Take home Technical Assessment – To understand your technical ability with Python and SQL
• 2nd Stage Interview – Bing Quiogue – Director of Engineering – Overview of your background and key projects you have worked with. Technical and scenario-based questions
• 3rd Stage Interview – Derek Parks – Senior Data Engineer – Chemistry conversation and a chance to meet more of the Data Team

AWS, SQL, Python, Spark
Apply Here
For Remote Data Engineer / Senior Data Engineer roles, visit Remote Data Engineer / Senior Data Engineer Roles

********

Lead Data Engineer at inTulsa

Location: Boston

inTulsa is partnering with 9b Corporation to find candidates for this exciting role in Tulsa.

Location: Tulsa, OK. This is a Hybrid position for those already in Tulsa, OK or willing to relocate to Tulsa, OK

About the job

We are looking for an experienced Lead Data Engineer with expertise in Amazon Web Services (AWS) to join our growing team in Tulsa, OK. In this job, you will get to assist various clients in focus areas such as:
• Criminal Justice Reform in Oklahoma
• Education in Oklahoma
• Nonprofits that serve Tulsa, OK, and Oklahoma City, OK
• Municipalities (cities and counties) across the United States
• Various startups in Upside Down Labs (like ThirdLine)

Since we do client work, you should expect to take on some tasks that do not fit within this exact job description, but in general you will be responsible for:
• Collaborating with clients to understand and prioritize their data reporting needs and developing user stories
• Designing and helping administer data architecture in AWS (Azure/GCP a plus)
• Designing and leading teams to implement data pipelines
• Exploring clients’ data environment and assessing gaps to meet requirements
• Providing technical expertise on data storage, data mining, and data cleansing
Who we’re looking for

9b does client work, so we are looking for team members who can listen well and quickly adapt to ever-evolving requirements. The ideal candidate is equally comfortable with autonomy and collaboration. The most valuable skill you can bring to our team is an ability and eagerness to learn. In addition to your technical expertise, we are looking for someone who cares about Tulsa and wants to help purpose-driven clients use their data to create impact in the community.
Technical Skills
• 6+ years of experience as a Data Engineer
• 6+ years of experience with SQL
• 6+ years of experience with Python
• 6+ years of experience with AWS
• 6+ years of experience with APIs
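Working with APIs, the last skill listed above, very often reduces to a cursor-based pagination loop. The sketch below shows the shape of that loop in plain Python; the page structure and the stub standing in for an HTTP client are hypothetical.

```python
# Hypothetical sketch of paginated API ingestion: keep requesting
# pages until the API stops returning a continuation cursor.

def fetch_all(fetch_page):
    """Collect every record from a paginated API.

    `fetch_page(cursor)` must return (records, next_cursor),
    with next_cursor = None on the final page.
    """
    records, cursor = [], None
    while True:
        page, cursor = fetch_page(cursor)
        records.extend(page)
        if cursor is None:
            return records

# Stub standing in for a real HTTP client (e.g. one wrapping
# requests.get); keys and values are made up.
pages = {None: ([1, 2], "a"), "a": ([3], None)}
print(fetch_all(pages.get))  # [1, 2, 3]
```

In a real pipeline the stub would issue authenticated HTTP calls and the collected records would land in S3 or a staging table, but the cursor loop itself stays the same.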
Personal Skills
• Experience interacting with clients/stakeholders
• Strong interpersonal and teamwork skills
• An ability to adapt to new situations and challenges
• Excellent listening and communication skills
Why work here

Our employees enjoy unlimited PTO, flexible work hours, and a remote-friendly work environment. Employees have access to 401(k) retirement, short-term and long-term disability, and life insurance benefits, plus a stipend for covering healthcare costs. In addition, at the Senior level, you are eligible for a quarterly performance bonus.

9b operates as a Holacracy, which is an organizational structure that emphasizes distributed authority instead of a command hierarchy. You will fill roles with defined accountabilities and engage with your team members to improve our company’s systems, processes, and policies. An autonomous yet collaborative person with creative problem-solving skills will thrive at 9b.
About 9b

9b Corp provides data and analytics solutions to purpose-driven organizations. Our services are customized to each client and function to help our communities thrive. As a Certified B Corporation, we take great pride in our efforts to positively impact not only our team but the greater community around us. Learn more about us at .

All applicants are considered for all positions without regard to race, religion, color, sex, gender, sexual orientation, pregnancy, age, national origin, ancestry, physical/mental disability, medical condition, military/veteran status, genetic information, marital status, ethnicity, alienage or any other protected classification, in accordance with applicable federal, state, and local laws. By completing this application, you are seeking to join a team of hardworking professionals dedicated to consistently delivering outstanding service to our customers and contributing to the financial success of the organization, its clients, and its employees. Equal access to programs, services, and employment is available to all qualified persons. Those applicants requiring accommodation to complete the application and/or interview process should contact a management representative.
Apply Here
For Remote Lead Data Engineer roles, visit Remote Lead Data Engineer Roles

********

Senior Data Engineer at Methodical Search

Location: Boston

Title: Senior Data Engineer

Type: Full-Time Position (Remote)

Location: Remote anywhere in the U.S.

Sponsorship: No sponsorship will be provided for this role. Must be a US Citizen.

Summary:

We are seeking a senior data engineer to join the Engineering Team in this Boston-based company and support a new and rapidly growing data science program within the company to solve big problems faced by members of the US military, veterans, family members, caregivers, and survivors. Reporting to the Director of Engineering, you will work with a combination of US-based software engineers as well as an off-shore contracted development team. You will be responsible for a data warehouse with billions of data points that are protected and ethically used for the benefit of the millions of users sharing this information. Data is the engine that powers the business.

Essential Functions:
• You will be responsible for all aspects of the data infrastructure including: production and development databases, an off-line data warehouse, and a data lake.
• You will be responsible for designing and implementing a cost-effective and efficient data replication and backup strategy.
• While there are aspects of database administration, this is primarily an engineering role and you will be a member of the development team.
• The ideal candidate will be comfortable with database and data pipeline design, have software development skills (the more the better), understand the core concepts around machine learning infrastructure, have worked extensively in the cloud, and care deeply about data security.

Key Responsibilities:
• Write optimized queries, views and triggers for integration with other applications such as data science.
• Oversee data pipelines and a data lake.
• Develop and maintain database replication and clustering.
• Monitor and optimize database performance and capacity planning including backup and recovery.
• Troubleshoot data pipeline issues, maintain data systems availability.
• Plan and execute for data system scalability.
• Oversee data security as part of the overall company information security program.
• Develop and optimize data pipeline design for new applications.
• As required, perform technical research, and oversee special projects.

Required Qualifications:
• 5+ years of experience as a data engineer using cloud-based systems. Experience with AWS is a plus.
• Subject matter expertise in PostgreSQL is required. Familiarity with Redshift would be a plus.
• Solid knowledge of server monitoring for detection of emerging issues.
• Experience with data intensive applications used to feed machine learning applications.
• Experience working as part of a distributed development team using an Agile SDLC.
• Degree in Computer Science, Computer Engineering, or other STEM discipline is ideal.
Apply Here
For Remote Senior Data Engineer roles, visit Remote Senior Data Engineer Roles

********

Cloud Data Engineer at The Oakleaf Group

Location: Boston

The Oakleaf Group is a mortgage and financial services consulting firm with expertise in risk management and financial modeling for the mortgage and banking industries. Our clients are banks and non-bank mortgage firms, government agencies, law firms, insurance companies, institutional asset managers and hedge funds.

We differentiate ourselves through our approach to client relationships. We begin with the belief that each client relationship will be permanent and ongoing, spanning projects and engagements. We invest in communication and research to ensure that we fully understand the drivers of every client’s short- and long-term success. We align our goals to those of our clients, and we continuously monitor and adjust to ensure that the relationship stays strong.

For a large financial services client, we are seeking an AWS Developer with Python. In this role you will be focused on AWS Development and Architecture. We are looking for hands-on professionals who can build good technical solutions and then roll up their sleeves to implement the solution.

Responsibilities:
• Work with Redshift ETL to extract, transform, and load databases and data in AWS.
• Build the cloud computing processes working with large data sets and advanced analytics.
• Create processes to take upstream data from data lakes and data marts and apply logic to them using SQL and Python and stored procedures.
• Use primarily AWS native tools.
• Work with financial services data and ensure that the data is secure and intact throughout migration.

Required Experience:
• 4+ years of experience with AWS Development and Cloud Architecture.
• 4+ years of experience with Python
• Must have strong experience with SQL.
• 2+ years of experience with AWS S3 and EMR required.
• Strong experience with Amazon Redshift is required.
• Strong experience with data migration, cloud migration, and ETL.
• Strong experience with AWS Lambda, Fargate, SNS, Elastic Beanstalk, ECS, SageMaker, and CloudWatch.
• Experience with AWS CodePipeline and CI/CD
• Experience in processing large scale data, database concepts and SQL.
• Experience with job scheduling and Flyway.
• Experience with enterprise data lakes, data warehouses, data marts, and big data.
• Excellent communication skills to ask questions, clarify requirements, and engage with the team and stakeholders.
• Strong logic, reasoning, and critical thinking skills to solve problems as they arise.
• Adaptive to change: Demonstrated ability to problem solve on the fly and tailor your approach to the resources at hand.
• Must be an independent problem solver who can evaluate a situation and build solution options.
• Bachelor’s Degree in computer science, engineering or related is required.

Preferred Experience:
• Experience with Jenkins is preferred.
• Experience with R is preferred.
• AWS Cloud certifications are a plus.
• Master’s degree is preferred.
Apply Here
For Remote Cloud Data Engineer roles, visit Remote Cloud Data Engineer Roles

********

Data Engineer with Python & AWS at SonicJobs

Location: Boston

As a Data Engineer in the Investment Science IT group, you will build and maintain large-scale data pipelines and infrastructure, and contribute to building sophisticated analytics, data platforms, and tools that enable research analysts to deliver investment insights.
• Must have experience with Python and AWS
• Must have strong data engineering experience
• Need experience with CI/CD tools
• Financial services experience is a plus
• Should be comfortable working in an agile environment
• Must have excellent communication and collaboration skills
• Experience with Snowflake is a plus
• Extensive experience in ElasticSearch/OpenSearch is a plus
• Experience in SQLAlchemy is a big plus
• Experience in Redshift is a plus
• Experience with dbt is a strong plus

Responsibilities:
• Design and build data pipelines and perform ETL on large-scale financial data sets
• Leverage ETL programming skills in open-source languages, including Python, Scala, and SQL, on various frameworks
• Apply DevOps techniques and practices such as continuous integration, continuous deployment, test automation, build automation, and test-driven development to enable rapid delivery of working code, using tools like Jenkins, Nexus, GitHub, and Docker
• Apply cloud computing experience, preferably with AWS and its services, including S3, EMR/EC2, and Lambda functions
• Manage multiple responsibilities in an unstructured environment where you’re empowered to make a difference; in that context, you will be expected to research and develop cutting-edge technologies to accomplish your goals
• Demonstrate a firm understanding of delivering large-scale data set solutions and SDLC best practices
• Champion business partners’ needs and provide input into the design and development of broader strategic solutions
• Participate in onboarding new datasets and supporting the day-to-day data needs of investors and researchers
• Research and develop cutting-edge technologies to accomplish the organization’s goals
• Communicate progress and any impediments on backlog items to stakeholders in a timely manner
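A pipeline of composable transformation steps, as described in these responsibilities, might be sketched as follows. The step functions and field names are illustrative, not from the posting:

```python
from typing import Callable, Iterable

Record = dict
Step = Callable[[Iterable[Record]], Iterable[Record]]

def pipeline(*steps: Step) -> Step:
    """Compose transformation steps into one callable applied in order."""
    def run(records: Iterable[Record]) -> Iterable[Record]:
        for step in steps:
            records = step(records)
        return records
    return run

# Illustrative steps over daily price records.
def drop_missing_prices(records):
    return (r for r in records if r.get("price") is not None)

def add_price_cents(records):
    return ({**r, "price_cents": int(round(r["price"] * 100))} for r in records)

if __name__ == "__main__":
    etl = pipeline(drop_missing_prices, add_price_cents)
    rows = [{"ticker": "ABC", "price": 10.5}, {"ticker": "XYZ", "price": None}]
    print(list(etl(rows)))  # only the ABC row survives, with price_cents=1050
```

Because each step consumes and yields an iterable of records, steps stay independently testable and the generator chain streams large data sets without holding them in memory.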

Apply Here
For Remote Data Engineer with Python & AWS roles, visit Remote Data Engineer with Python & AWS Roles

********

Senior/Software Engineer – Data & Analytics Engineering at Liberty Mutual Insurance

Location: Boston

Are you inspired by the blending of Data and Technology to solve complex business challenges? At Liberty Mutual in Data and Analytics Engineering (DAE), we deliver well engineered solutions that enable the success of our business analytic and data science customers. We are looking for a talented and energetic person who is passionate about using cutting edge technology to deliver business value. DAE embraces a hybrid work culture, offering a full range of work location arrangements.

Job Summary: In this role you will work collaboratively on an agile team to develop and enhance complex systems and/or software from user stories and technical/architectural specifications. You will analyze complex technical system problems and create innovative solutions that exceed customer expectations. This role directly supports the rapidly growing Data Science community at Liberty Mutual.

This is a fast-paced environment providing rapid delivery for our business partners. You will be working in a highly collaborative environment that values speed and quality, with a strong desire to drive change and foster a positive work environment as we continue our agile transformation journey. You will have the opportunity to help lead this change with us as we grow this culture, mindset and capability.

Note: This is a range posting – open to considering at the Engineer or Senior Engineer level.

In this role you will:
• Work in a dynamic and exciting agile environment with Engineers, Scrum Masters, and Product Owners to develop creative data-driven solutions that meet business and technical initiatives
• Improve speed to market by focusing on current Data Science and modeling data needs as well as building out the long-term strategic data solutions using AWS, Java, Python, Lambda, as well as other modern data technologies
• Design and develop programs and tools to support ingestion, curation and provisioning of complex enterprise data to achieve analytics, reporting, and data science
• Demonstrate open minded and collaborative approach to creating innovative technical solutions
• Analyze data and technical system problems to design and implement effective, flexible solutions
• Handle end-to-end development, including coding, testing, and debugging during each cycle
• Develop automated tests for multiple scopes (Unit, System, Integration, Regression)
• Mentor new and junior developers
• Identify and recommend appropriate continuous improvement opportunities
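As a concrete illustration of the automated-testing responsibility above at the unit scope, here is a sketch using Python's standard unittest; the curate function and its rules are hypothetical, not from the posting.

```python
import unittest

def curate(record: dict) -> dict:
    """Hypothetical curation step: trim whitespace and normalize the
    state code to upper case before provisioning the record."""
    return {
        "name": record["name"].strip(),
        "state": record["state"].strip().upper(),
    }

class CurateUnitTests(unittest.TestCase):
    def test_trims_whitespace(self):
        self.assertEqual(curate({"name": " Ada ", "state": "ma"})["name"], "Ada")

    def test_normalizes_state_code(self):
        self.assertEqual(curate({"name": "Ada", "state": " ma "})["state"], "MA")

if __name__ == "__main__":
    unittest.main(exit=False)
```

In a CI/CD pipeline, the same suite would run on every commit; system, integration, and regression scopes follow the same pattern against larger fixtures.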

Qualifications:
• Bachelor’s or Master’s degree in technical or business discipline or equivalent experience, technical degree preferred
• Experience developing back end, data warehouse technology solutions
• Experience developing front end interfaces using React
• Knowledge of a variety of data platforms including Teradata, DB2 (Cloud based DB a plus)
• Experience with AWS (such as S3, Snowflake, Athena, EMR)
• Experience with key value storage data concepts (DynamoDB, Cassandra)
• Experience with version control (Atlassian Bitbucket)
• Experience with UI/UX design thinking
• Extensive knowledge of IT concepts, strategies, and methodologies.
• Experience working with agile methodologies (Scrum, Kanban, XP) and cross-functional teams (Product Owners, Scrum Masters, Developers, Test Engineers)
• Versed in diverse technologies and new technical architecture principles and concepts
• Demonstrates leadership and active pursuit of optimizing CI/CD process and tools, testing frameworks and practices
• Must be proactive, demonstrate initiative, and be a logical thinker
• Must be team oriented with strong collaboration, prioritization, and adaptability skills required

Additional Qualifications:
• Understanding of Cloud / Hybrid data architecture concepts
• Understanding of insurance industry and products
• Excited by trying new technology and learning new tools

Qualifications
• Bachelor’s degree in technical or business discipline or equivalent experience
• Generally, 3+ years of professional experience
• Strong oral and written communication skills; presentation skills
• Proficient in negotiation, facilitation and consensus building skills
• Proficient in new and emerging technologies
• Thorough knowledge of the following: IT concepts, strategies and methodologies
• Business functions and business operations
• Design and development tools
• Architectures and technical standards
• Thorough knowledge of layered systems architectures and layered solutions and designs; understanding of shared data engineering concepts
• Proficiency in multiple programming languages and tools
• Understanding of agile data engineering concepts and processes
• Must be proactive, demonstrate initiative, and be a logical thinker
• Consultative skills, including the ability to understand and apply customer requirements, draw out unforeseen implications, make design recommendations, articulate design reasoning, and understand the potential impacts of design requirements
• Collaboration, prioritization, and adaptability skills required

At Liberty Mutual, our purpose is to help people embrace today and confidently pursue tomorrow. That’s why we provide an environment focused on openness, inclusion, trust and respect. Here, you’ll discover our expansive range of roles, and a workplace where we aim to help turn your passion into a rewarding profession.

Liberty Mutual has proudly been recognized as a “Great Place to Work” by Great Place to Work® US for the past several years. We were also selected as one of the “100 Best Places to Work in IT” on IDG’s Insider Pro and Computerworld’s 2020 list. For many years running, we have been named by Forbes as one of America’s Best Employers for Women and one of America’s Best Employers for New Graduates-as well as one of America’s Best Employers for Diversity. To learn more about our commitment to diversity and inclusion please visit: https://jobs.libertymutualgroup.com/diversity-inclusion

We value your hard work, integrity and commitment to make things better, and we put people first by offering you benefits that support your life and well-being. To learn more about our benefit offerings please visit: https://LMI.co/Benefits

Liberty Mutual is an equal opportunity employer. We will not tolerate discrimination on the basis of race, color, national origin, sex, sexual orientation, gender identity, religion, age, disability, veteran’s status, pregnancy, genetic information or on any basis prohibited by federal, state or local law.
Apply Here
For Remote Senior/Software Engineer – Data & Analytics Engineering roles, visit Remote Senior/Software Engineer – Data & Analytics Engineering Roles

********

Staff Data Engineer at Thermo Fisher Scientific

Location: Boston

Title: Staff Data Engineer

Requisition ID: 216239BR

At Thermo Fisher Scientific, our work has a purpose. Our work requires passion and builds meaningful outcomes; our work matters. We share our expertise and technological advancements with customers, helping them make the world a better place, whether they’re discovering a cure for cancer, protecting the environment, or ensuring our food is safe.

Our people share a common set of values – Integrity, Intensity, Innovation, and Involvement. We work together to accelerate research, tackle complex analytical challenges, improve patient diagnostics, drive innovation, and increase laboratory productivity. Each one of us contributes to our mission every day – to enable our customers to make the world healthier, cleaner, and safer.

Location/Division Specific Information

On-site (anywhere in the United States), Hybrid, or Remote.

Discover Impactful Work

As part of the Machine Learning Operations (MLOps) team within R&D AI Engineering, you will own data lakes, pipelines, and data management on premises and in the cloud (AWS environment) in a fast-paced setting. You will work on sophisticated problems that improve our end users’ experiences, drive growth in a multifaceted company, and use analytics and MLOps technologies. Your goal is to create solutions enabling the development and deployment of AI/ML-enabled products in the cloud and onto edge devices.

A day in the Life

You are a hands-on data scientist/engineer/architect who wants to make a difference!
• Work with the MLOps team to develop and maintain scalable MLOps frameworks and DataOps tools that can be integrated into ML platforms for R&D data science and AI Engineering teams
• Collaborate with biologists, chemists, experimentalists, and data scientists in other scientific divisions to support their R&D workflows and onboard MLOps frameworks
• Develop data pipelines and manage data ingestion, data transformation, data analysis, data querying, data visualization, modeling, and deployment
• Build systems that integrate existing data lakes and align with other corporate and R&D data lakes
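A schema-validation gate at the ingestion boundary, as implied by the pipeline responsibilities above, could look roughly like this; the required fields and assay data are invented for illustration.

```python
from dataclasses import dataclass, field

REQUIRED_FIELDS = {"sample_id", "assay", "value"}  # hypothetical schema

@dataclass
class ValidationReport:
    accepted: list = field(default_factory=list)
    rejected: list = field(default_factory=list)

def validate_batch(records):
    """Split an ingestion batch into schema-conforming records and
    records routed to a quarantine path for later inspection."""
    report = ValidationReport()
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing or not isinstance(rec.get("value"), (int, float)):
            report.rejected.append(rec)
        else:
            report.accepted.append(rec)
    return report

if __name__ == "__main__":
    batch = [
        {"sample_id": "S1", "assay": "qPCR", "value": 0.87},
        {"sample_id": "S2", "assay": "qPCR"},  # missing "value"
    ]
    report = validate_batch(batch)
    print(len(report.accepted), len(report.rejected))  # 1 1
```

Quarantining rather than dropping malformed records preserves the raw data for the cataloging and enrichment steps the role calls for.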

Keys to Success

Education
• Bachelor’s degree in computer science, computer engineering, information systems, or a related field.

Minimum Qualifications You Must Have:
• 5+ years of experience in ETL/data engineering, including cataloging, enrichment, exploration, management, processing, validation, and visualization
• Comfortable with Linux, shell scripting, C/C++, and Python
• Experience with AWS purpose-built databases, including S3, RDS, DynamoDB, Redshift, and Database Migration Service
• Experience with structured/unstructured datasets and data store tools
• Excellent oral and written communication skills to present technical information to both business and technology teams with clarity and precision
• Resourcefulness, creativity, excellent interpersonal skills, attention to detail, and the ability to think critically and solve problems that are in line with business objectives and strategic vision

Preferred Qualifications to Make You Stand Out from the Crowd:
• Experience in a wide variety of data formats, including Parquet, Avro, and Protocol Buffers
• Knowledge of vendor-neutral data lakes, data pipelines, and data stores, and experience with Databricks and Snowflake for specific business use cases
• Experience with Apache Spark and Apache Hadoop
• Experience with AWS tools, including Athena, Step Function, CloudFormation, and Kinesis
• Experience with ML compute and ML model management platforms
• Knowledge of ML model development tools such as Keras, PyTorch, TensorFlow, and Jupyter
• Experience with Scala and Golang
• Excellent grasp of software practices in an agile development environment

At Thermo Fisher Scientific, each one of our 65,000 extraordinary minds has a unique story to tell. Join us and contribute to our singular mission—enabling our customers to make the world healthier, cleaner and safer.

Apply today! http://jobs.thermofisher.com

Thermo Fisher Scientific is an EEO/Affirmative Action Employer and does not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability or any other legally protected status.
Apply Here
For Remote Staff Data Engineer roles, visit Remote Staff Data Engineer Roles

********

The Tech Career Guru