Full-time Data Engineer openings in Boston on September 09, 2022

Data Engineer at CNHI

Location: Boston

As a Data Engineer, you will enable data-driven decision making within the CNH Digital organization. The Data Analytics team is responsible for planning, implementing, monitoring, and continuously improving the CNH Digital infrastructure. The team supports all aspects including but not limited to evolving governance processes, security, maintenance, daily operations, logistics, engineering, and equipment management.
• Develop, implement, and maintain the information data lake and insight platforms to enable decision support systems for the overall organization. Partner with the business and our customers to understand their data and reporting requirements.
• Bring data sets together to answer business questions and drive growth. Design ETLs to ingest the data into the data warehouse and data lake, as well as end-user facing reporting applications.
• Collaborate with business customers and development teams to define requirements and deliver flexible, scalable, end-to-end solutions.
• Work with new technologies while driving Business Intelligence solutions end-to-end: business requirements, data modeling, ETL, metadata, reporting, and dashboarding.
• Apply expertise in the design, creation, management, and business use of large datasets.
• Define SLAs for all areas of ownership.

Position Pays $83,000 to $120,000 (Actual salaries will vary and will be based on various factors, such as skill, experience and qualification for the role.)

Qualifications:
• Bachelor’s Degree in Computer Science, Information Systems, Mathematics, Statistics, or related field.
• 1+ years of career experience, or equivalent combination of education and experience.
• Experience with data modeling, SQL, ETL, data warehousing, and data lakes
• Experience in writing SQL and Python scripts
• Experience with enterprise-class Business Intelligence tools such as Power BI

Preferred Qualifications:
• Experience working with Spark and Databricks
• Ability to balance and prioritize multiple conflicting requirements with high attention to detail.
• Strong verbal/written communication & data presentation skills, including ability to succinctly summarize key findings and effectively communicate with both business and technical teams.
• Familiarity with scripting languages such as Python, PySpark, or JavaScript
• Knowledge of Azure Databricks and MPP databases
• Exposure to predictive/advanced analytics and tools (such as R)
• Experience with Datalake development
• Exposure to NoSQL databases (such as Cosmos DB, MongoDB)
• Experience following Agile processes
• Experience working with distributed version control systems such as Git
Apply Here
For Remote Data Engineer roles, visit Remote Data Engineer Roles

********

Manager, Data Engineering at Chewy

Location: Boston

Chewy is seeking a Manager of Data Engineering to join our IT Engineering team in Dania Beach, FL or Boston, MA. This new position will lead the engineering technology team responsible for building data products that support data science, analytics, and automation for Chewy. The data engineering team owns implementation, data stewardship, data architecture, and data transformation. The ideal candidate will have proven, hands-on experience managing a team of engineers and maturing team operations. Above all other factors, we are looking for smart, driven candidates who want to be part of a culture of innovation and creativity. This role will report to the Director of Data Engineering.

What You’ll Do:

Manage and lead a team of 5-7 experienced data engineers responsible for building Chewy’s data platforms.

Drive technical decisions and be accountable for the delivery of solutions on initiatives you own.

Contribute to road map definition in collaboration with technical and product leadership.

Must be able to identify and manage priorities within the context of overall company objectives based on data-driven decision making.

Establish strong working relationships at all organizational levels and across functional teams as the subject matter expert for your team.

Partner with our recruiting team to continue to build your team and mentor the growth of the existing team.

Review code and implementations, provide feedback for technical contributors.

Establish technical standards and guidelines to ensure integrity of systems and compliance with company IT standards, policies, and processes.

Ensure service level agreements with business units are met.

Implement operations processes that cover day-to-day jobflow execution and a support model that reduces support requirements on engineers.

Implement process measurements to ensure quality standards and throughput are maintained by each engineer.

What You’ll Need:

Demonstrated experience initiating and leading enterprise data engineering initiatives.

Experience overseeing the ongoing operation of a data engineering team.

Development background in multiple large scale projects involving data integration.

Strong technical Back End experience with SQL, NoSQL, and Big Data platforms (and related technologies).

Demonstrated ability to think/act strategically and influence key leaders and partners.

10+ years of related IT work experience including architecture, design, deployment and systems lifecycle development management.

4+ years of leadership experience in managing cross-functional teams and direct management of teams (functional and technical).

Strong analytics and communications background (e.g., math, statistics, quantitative methods, and verbal and written proficiency).

Must have a proven track record of working with Senior and C-level executives to determine analytics data requirements and implementing systems to warehouse and retrieve key business information.

Proven delivery experience working under an Agile/Scrum methodology.

Position may include occasional travel.

Bonus:

eCommerce experience.

AWS experience.

Experience with CI/CD processes and release management.

Experience with Snowflake or Vertica.
Apply Here
For Remote Manager, Data Engineering roles, visit Remote Manager, Data Engineering Roles

********

Data Engineer – REMOTE at OCTO CONSULTING GROUP

Location: Boston

Octo is an industry-leading, award-winning provider of digital services for the federal government. Octo specializes in providing agile software engineering, user experience design, cloud services, and digital strategy services that address government’s most pressing missions. Octo delivers intelligent solutions and rapid results, yielding lower costs and measurable outcomes.

Our team is what makes Octo great. At Octo you’ll work beside some of the smartest and most accomplished staff you’ll find in your career. Octo offers fantastic benefits and an amazing workplace culture where you will feel valued while you perform mission critical work for our government. Voted one of the region’s best places to work multiple times, Octo is an employer of choice!

Job Description

You

As a Data Engineer, you will work within a team providing Data Warehouse and Business Intelligence services to our government customer using Agile processes. You will work with a variety of large data sources with different schemas and data elements to produce an effective and efficient Data Warehouse. You have an eye for spotting data correlations and a desire to dig into large datasets to find technical solutions and deliver business value.

Us

We were founded as a fresh alternative in the Government Consulting Community and are dedicated to the belief that results are a product of analytical thinking, agile design principles and that solutions are built in collaboration with, not for, our customers. This mantra drives us to succeed and act as true partners in advancing our client’s missions.

Program Mission

The program you will be supporting has a mission to provide development, security, and operations (DevSecOps) support to U.S. Citizenship and Immigration Services (USCIS) with a focus on development, operations, and modernization of the Agency’s Enterprise Data Warehouse/Data Lake. The team utilizes open-source, AWS Cloud, and Big Data technologies, agile project management practices, and modern DevSecOps delivery to provide the Business Intelligence support systems to meet the reporting, data analytics, and machine learning/artificial intelligence needs critical to USCIS leadership, data/business analysts, data scientists, and other decision-makers.

Skills & Requirements

Requirements
• Interact with designated product owners, system owners, and source system business owners to understand transactional system data models and elicit requirements and logic for ETLs
• Develop ETL workflows/data pipelines to ingest data using AWS Data Migration Service (DMS), Scala, Kafka, RESTful APIs, and other technologies as determined by the client from multiple transactional systems to the target (including ODS, data marts, and data lake) according to documented logic and source-to-target mappings
• Re-develop legacy ETL code (i.e., PL/SQL, materialized views, Informatica PowerCenter) into modernized data pipelines
• Troubleshoot data discrepancy and missing data issues resulting from daily ETL loads
• Work with the operations team to deploy ETL jobs in integration and production environments and debug/troubleshoot critical production issues.
• Develop JSON Objects for a JSON Schema Factory that call upon approved data standards to generate form schemas
• Actively participate in Agile release development activities and ceremonies, including sprint planning, sprint grooming, artifact creation, sprint testing, demonstrations, retrospectives, and solution releases.
• Document ETL logic, mappings, etc. in a concise and traceable manner to be used as a reference for future development and maintenance
• Execute other activities related to development work, such as participating in meetings and providing briefings, presentations, and other support materials that promote the program, assist in achieving user buy-in, and explain technical concepts to non-technical audiences

Desired Skills
• 5+ years of experience with ETL development ingesting data from diverse, large-scale data sources
• 5+ years of experience with programming languages such as Java, Scala, Python, R, JSON Schema
• 5+ years of experience producing and consuming REST APIs. AWS Database Migration Service (DMS), Databricks/Apache Spark, and/or Kafka experience highly desired.
• 5+ years of experience with relational databases used to support BI analytics. Postgres and Oracle experience highly desired.
• Demonstrated experience in a Data Warehouse/Data Lake and Business Intelligence environment
• Ability to write complex SQL queries and scripts
• Strong teamwork, coordination, planning, and influencing skills
• Self-driven with the ability to adapt quickly, work in a challenging and fast-paced environment within cross-functional teams, and to promote creative problem solving within their team
• Knowledge of Agile methodologies, including Scrum and Kanban, and management tools (e.g., Jira, Confluence)
• Familiarity with Git and branching strategies
• Experience with engineering/DevOps tools (e.g., Jenkins)
• Excellent analytical, communication and organizational skills
• Experience working in AWS Cloud environment
• Experience with Microsoft Office Suite including Excel, PowerPoint, and Visio

Education: Bachelor’s degree in a technical discipline preferred – Computer Science, Mathematics, or equivalent technical degree, or the equivalent combination of education, professional training, and work experience.

Location: Reston, VA – Currently fully remote due to COVID-19.

Clearance: Must be a US Citizen and be able to obtain a government agency Suitability Clearance. USCIS Entry on Duty (EOD) preferred.

Octo is an Equal Opportunity/Affirmative Action employer. All qualified candidates will receive consideration for employment without regard to disability, protected veteran status, race, color, religious creed, national origin, citizenship, marital status, sex, sexual orientation/gender identity, age, or genetic information. Selected applicant will be subject to a background investigation.
Apply Here
For Remote Data Engineer – REMOTE roles, visit Remote Data Engineer – REMOTE Roles

********

Data Engineer (Somerville, MA or Remote) at Tulip

Location: Somerville

Tulip, the leader in frontline operations, is helping companies around the world equip their workforce with connected apps, leading to higher quality work, improved efficiency, and end-to-end traceability across operations. Companies of all sizes and across industries have implemented composable solutions with Tulip’s cloud-native, no-code platform to solve some of the most pressing challenges in operations: error-proofing processes and boosting productivity, capturing and analyzing real-time data, and continuous improvement.

A spinoff out of MIT, Tulip is headquartered in Somerville, MA, with offices in Germany and Hungary. Focused on composable, human-centric solutions for industrial environments, Tulip is disrupting the MES category and has been recognized as a World Economic Forum Global Innovator.

About You:
• You love a good challenge and learning new things.
• You have gotten disparate systems working robustly together. If you have not already, you want to control the physical world from the web.
• You love building new interfaces that push the product forward and delight users.
• You’re able to own a core part of the product and juggle the different requirements that come along with it.
• You are comfortable moving around a large technology stack to understand how those features work and contribute to different parts of the platform.

What skills do I need?
• Highly proficient in SQL and distributed data processing pipelines (Spark, Glue, Airflow)
• Experience building and maintaining a data warehouse (Snowflake, Redshift)
• Works well as an individual or as part of a team, with effective communication

Key Responsibilities:
• Develop and maintain the data warehouse.
• Develop new business metrics and improve existing ones.
• Work directly with stakeholders to identify data needs/requirements.
• Produce clean, efficient code based on specifications
• Test and deploy programs and systems
• Work with developers to design algorithms and flowcharts
• Integrate software components and third-party programs
• Troubleshoot, debug and upgrade existing software
• Gather and evaluate user feedback
• Recommend and execute improvements
• Create technical documentation for reference and reporting

Key Collaborators:

Engineering, Product Management, Hardware, Commercial Teams

Working At Tulip

We know even great candidates experience imposter syndrome. Even if you don’t match every requirement, applying gives you the opportunity to be considered.

We’re building a strong, diverse team that values hard work, families, and personal well-being. Benefits of working with us include:
• Direct impact on product and culture
• Company equity
• Competitive benefits package including Health, Dental, Vision, Short-term Disability, Long-term Disability, Life Insurance, AD&D Insurance, Flexible Spending Account (FSA), Commuter Benefits, Parental Leave, and 401(K)
• Flexible work schedule and unlimited vacation policy
• Virtual company events and happy hours
• Fitness subsidies

We are an equal opportunity employer. At Tulip, we celebrate all. Qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. Help us build an inclusive community that will transform frontline operations.
Apply Here
For Remote Data Engineer (Somerville, MA or Remote) roles, visit Remote Data Engineer (Somerville, MA or Remote) Roles

********

Principal Data Engineer – Telecommute at UnitedHealth Group

Location: Boston

Combine two of the fastest-growing fields on the planet with a culture of performance, collaboration and opportunity and this is what you get. Leading edge technology in an industry that is improving the lives of millions. Here, innovation is not about another gadget; it is about making health care data available wherever and whenever people need it, safely and reliably. There is no room for error. If you are looking for a better place to use your passion and your desire to drive change, this is the place to be. It’s an opportunity to do your life’s best work.(sm)

We are looking for a Principal Data Engineer to assist in the continued development of our award-winning health analytics platform, which combines cutting-edge technology, data informatics services and cloud-based analytics tools. In this exciting role, you will collaborate with some of the smartest developers in the industry, designing and implementing solutions to some of the toughest information challenges facing health care today.

We are looking for talented, innovative, effective engineers who take pride in and ownership of their responsibilities, and who can confidently communicate and champion their ideas.

You’ll enjoy the flexibility to telecommute from anywhere within the U.S. as you take on some tough challenges.

Primary Responsibilities:
• Meet the Data Engineering needs of the Optum Analytics Life Sciences business as part of the Life Science Data Factory team.
• Perform all phases of data engineering including: requirements analysis, design, development and testing
• Design and implement features in collaboration with business and IT stakeholders
• Design reusable workflows, components, frameworks and libraries
• Design and develop innovative solutions to meet the needs of the business
• Review code and provide feedback to peers
• Research and champion new technologies

You’ll be rewarded and recognized for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role as well as provide development for other roles you may be interested in.

Required Qualifications:
• Bachelor’s degree
• 5+ years of experience writing SQL (using tools like Oracle, or …)
• 5+ years of Python / Scala
• 3+ years developing solutions on AWS ecosystem
• Prior experience translating business requirements into production solutions
• Prior experience delivering quality results with minimal oversight

Preferred Qualifications:
• Master’s Degree
• Background in healthcare data
• Experience leading small teams
• Scala
• Terraform / Jenkins

To protect the health and safety of our workforce, patients and communities we serve, UnitedHealth Group and its affiliate companies require all employees to disclose COVID-19 vaccination status prior to beginning employment. In addition, some roles and locations require full COVID-19 vaccination, including boosters, as an essential job function. UnitedHealth Group adheres to all federal, state and local COVID-19 vaccination regulations as well as all client COVID-19 vaccination requirements and will obtain the necessary information from candidates prior to employment to ensure compliance. Candidates must be able to perform all essential job functions with or without reasonable accommodation. Failure to meet the vaccination requirement may result in rescission of an employment offer or termination of employment

Careers with Optum. Here’s the idea. We built an entire organization around one giant objective; make health care work better for everyone. So when it comes to how we use the world’s large accumulation of health-related information, or guide health and lifestyle choices or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve. Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. For you, that means working on high performance teams against sophisticated challenges that matter. Optum, incredible ideas in one incredible company and a singular opportunity to do your life’s best work.(sm)

Colorado, Connecticut or Nevada Residents Only: The salary range for Colorado residents is $97,300 to $176,900. The salary range for Connecticut / Nevada residents is $97,300 to $176,900. Pay is based on several factors including but not limited to education, work experience, certifications, etc. In addition to your salary, UnitedHealth Group offers benefits such as, a comprehensive benefits package, incentive and recognition programs, equity stock purchase and 401k contribution (all benefits are subject to eligibility requirements). No matter where or when you begin a career with UnitedHealth Group, you’ll find a far-reaching choice of benefits and incentives.

All Telecommuters will be required to adhere to UnitedHealth Group’s Telecommuter Policy.

Diversity creates a healthier atmosphere: UnitedHealth Group is an Equal Employment Opportunity/Affirmative Action employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law.

UnitedHealth Group is a drug – free workplace. Candidates are required to pass a drug test before beginning employment.
Apply Here
For Remote Principal Data Engineer – Telecommute roles, visit Remote Principal Data Engineer – Telecommute Roles

********

Senior Software/Data Engineer at Liberty Mutual Insurance

Location: Boston

Senior Software/Data Engineer – Customer and Party API, Global Retail Markets (GRM) US Technology

If you’re a curious learner who wants to develop customer-centric solutions from idea to production within a small, empowered, agile team, join our tech team. As a Software/Data Engineer you will be aligned to an Agile squad building out our future state customer/party APIs, with a primary focus on partnering with team(s) to execute on the customer vision, executing modernization efforts of our legacy APIs, influencing the project roadmap, and helping develop and grow our team. We are seeking a candidate with extensive experience designing and building data-driven business-critical applications in Java.

The work we do is at the foundation of our customer strategy. There is an exciting mix of legacy services, cloud-native APIs in production and active development, and many areas ripe for creative problem solving. The customer domain is a key component of the Liberty Mutual strategy to provide our customers with broad, useful and competitively-priced insurance products and services to meet their ever-changing needs. As we continue to make modernization a priority in this space, you would also be involved with other technical leaders in conversations and planning around the future of customer and party efforts.

As the Customer/Party API Software/Data Engineer, you will partner with the aligned Product Owner(s) to understand the vision for the area, services, and capabilities developed and owned by the aligned squad. You will develop a deep understanding of the business needs and goals of a variety of internal and external customers. Orchestrating business priorities with organizational and team technical goals into short- and long-term plans makes for a constant flow of interesting, empowering work.

In this role, you would:
• Work within legacy and modern architectures, frameworks, and tools.
• Contribute to customer/party technical vision along with peers and architects.
• Partner with the Product Owner(s) to develop and continually refine a roadmap that drives technology modernization alongside business priorities.
• Partner with teammates in the development of the customer ecosystem and technical standards based on business plan and vision.
• Propose and implement improvements to increase process efficiency and effectiveness, providing input to solution designs to ensure consistency, security, maintainability and flexibility.
• Continually enhance the full delivery pipeline through automation and expanded yet increasingly efficient test coverage, ultimately optimizing time-to-market and quality
• Support aligned team with challenging or critical production support work as required.
• Deliver artifacts, technical documents and designs that meet business requirements.
• Participate in and/or lead design and code reviews.
• Contribute to decisions that impact profitability and operational effectiveness.

Qualifications

This role might be for you if you have:
• Bachelor’s or Master’s degree in technical or business discipline or equivalent experience, technical degree preferred.
• Java – Extensive experience designing and building data-driven business-critical applications
• Generally, 2+ years of professional experience in software or data engineering with API development experience
• Knowledge of IT concepts, strategies, methodologies.
• Experience working with agile methodologies (Scrum, Kanban, XP) and cross-functional teams (Product Owners, Scrum Masters, Developers, Test Engineers)
• Versed in diverse technologies and new technical architecture principles and concepts, including extensive knowledge of layered systems architectures, solutions and designs, and shared software concepts.
• Strong unit testing and debugging abilities and/or experience with Test Driven Development (TDD)
• Strong SQL (Structured Query Language) syntax knowledge. Familiar with a wide variety of RDBMS installations
• Experience implementing Web services
• Knowledge of OpenAPI Specification (formerly known as Swagger Specification)
• Strong understanding of and experience working with applications running on Cloud Foundry
• Strong understanding of and experience using Git repositories and Bamboo or other pipeline systems
• Strong troubleshooting and analytical abilities. Experience troubleshooting high profile enterprise applications

Nice to Have (Not Required):
• Experience using IBM WebSphere Application Server (WAS) and/or WAS LP
• Experience using IBM Business Process Management Platform (BPM)
• Experience designing and implementing enterprise applications leveraging NoSQL databases
• Unix scripting
• Experience implementing applications within AWS ecosystem
• Knowledge and experience with Master Data Management development a plus
• Familiarity with other languages or frameworks such as Python, Node.js, Golang

At Liberty Mutual, our purpose is to help people embrace today and confidently pursue tomorrow. That’s why we provide an environment focused on openness, inclusion, trust and respect. Here, you’ll discover our expansive range of roles, and a workplace where we aim to help turn your passion into a rewarding profession. We value your hard work, integrity and commitment to make things better, and we put people first by offering you benefits that support your life and well-being. To learn more, please visit

Liberty Mutual has proudly been recognized as a Great Place to Work by Great Place to Work® US for the past several years. We were also selected as one of the 100 Best Places to Work in IT on IDG’s Insider Pro and Computerworld’s 2020 list. We have been named by Forbes as one of America’s Best Employers for Women and one of America’s Best Employers for New Graduates, as well as one of America’s Best Employers for Diversity. To learn more about our commitment to diversity and inclusion please visit:

Liberty Mutual is an equal opportunity employer. We will not tolerate discrimination on the basis of race, color, national origin, sex, sexual orientation, gender identity, religion, age, disability, veteran’s status, pregnancy, genetic information or on any basis prohibited by federal, state or local law.
Apply Here
For Remote Senior Software/Data Engineer roles, visit Remote Senior Software/Data Engineer Roles

********

Data Engineer III at Chewy

Location: Boston

Our Opportunity

Chewy is seeking a Data Engineer III to join our Supply Chain Team. The ideal candidate will have a strong background in system integrations, ETL pipelines, middleware, and automation via APIs. The candidate must be able to work with minimal supervision in a fast paced, rapid growth environment and actively participate as a member of a high performance cross-functional HR transformation initiative.

What You’ll Do
• Implement and maintain data integrations between various on-prem and enterprise cloud services
• Develop and support a comprehensive cloud-based data mart architecture to facilitate the flow, normalization, and synchronization of master data with downstream dependent systems
• Work with Product Managers and Business Analysts to understand functional requirements and interact with other cross-functional teams to architect, design, develop, test, and release features
• Analyze, troubleshoot, and resolve issues to maintain service level agreements
• Recommend long term processes and solutions to ease support issues and stabilize applications and their use
• Create and maintain existing documentation (configuration, test scripts, functional specs for reporting and integration)
• Contribute to continuous improvement and development of business processes.

What You’ll Need
• Candidate must possess a Bachelor’s degree in Computer Science, or related field, or equivalent experience
• 5+ years experience with SQL
• 5+ years experience with Amazon AWS foundational services (S3, EC2, RDS, Lambda)
• 2+ years experience with scripting languages (Python, PowerShell)
• 2+ years of experience interacting with web service APIs (SOAP, REST)
• 2+ years experience with GitHub and CI/CD pipelines
• Experience with Web Service testing tools such as SoapUI and Postman
• Ability to quickly understand and decompose HR, business, and technical concepts
• Strong written and verbal communication skills
• Certifications in Workday HCM and AWS a huge plus
• Qualified candidates must have a BS or BA degree in Technology, or equivalent degree

Chewy is committed to equal opportunity. We value and embrace diversity and inclusion of all Team Members.

If you have a disability under the Americans with Disabilities Act or similar law, or you require a religious accommodation, and you wish to discuss potential accommodations related to applying for employment at Chewy, please contact HR@chewy.com.

To access Chewy’s Privacy Policy, which contains information regarding information collected from job applicants and how we use it, please click here: https://www.chewy.com/app/content/privacy.
Apply Here
For Remote Data Engineer III roles, visit Remote Data Engineer III Roles

********

Senior/Software Engineer – Data & Analytics Engineering at Liberty Mutual Insurance

Location: Boston

Are you inspired by the blending of Data and Technology to solve complex business challenges? At Liberty Mutual in Data and Analytics Engineering (DAE), we deliver well engineered solutions that enable the success of our business analytic and data science customers. We are looking for a talented and energetic person who is passionate about using cutting edge technology to deliver business value. DAE embraces a hybrid work culture, offering a full range of work location arrangements.

Job Summary: In this role you will work collaboratively on an agile team to develop and enhance complex systems and/or software from user stories and technical/architectural specifications. You will analyze complex technical system problems and create innovative solutions that exceed customer expectations. This role directly supports the rapidly growing Data Science community at Liberty Mutual.

This is a fast-paced environment providing rapid delivery for our business partners. You will be working in a highly collaborative environment that values speed and quality, with a strong desire to drive change and foster a positive work environment as we continue our agile transformation journey. You will have the opportunity to help lead this change with us as we grow this culture, mindset and capability.

Note: This is a range posting – open to considering at the Engineer or Senior Engineer level.

In this role you will:
• Work in a dynamic and exciting agile environment with Engineers, Scrum Masters, and Product Owners to develop creative data-driven solutions that meet business and technical initiatives
• Improve speed to market by focusing on current Data Science and modeling data needs as well as building out long-term strategic data solutions using AWS, Java, Python, Lambda, and other modern data technologies
• Design and develop programs and tools to support ingestion, curation and provisioning of complex enterprise data to achieve analytics, reporting, and data science
• Demonstrate open minded and collaborative approach to creating innovative technical solutions
• Analyze data and technical system problems to design and implement effective, flexible solutions
• Handle end-to-end development, including coding, testing, and debugging during each cycle
• Develop automated tests for multiple scopes (Unit, System, Integration, Regression)
• Mentor new and junior developers
• Identify and recommend appropriate continuous improvement opportunities

Qualifications:
• Bachelor’s or Master’s degree in technical or business discipline or equivalent experience, technical degree preferred
• Experience developing back end, data warehouse technology solutions
• Experience developing front end interfaces using React
• Knowledge of a variety of data platforms including Teradata, DB2 (Cloud based DB a plus)
• Experience with AWS (such as S3, Snowflake, Athena, EMR)
• Experience with key value storage data concepts (DynamoDB, Cassandra)
• Experience with version control (Atlassian Bitbucket)
• Experience with UI/UX design thinking
• Extensive knowledge of IT concepts, strategies, methodologies.
• Experience working with agile methodologies (Scrum, Kanban, XP) and cross-functional teams (Product Owners, Scrum Masters, Developers, Test Engineers)
• Versed in diverse technologies and new technical architecture principles and concepts
• Demonstrates leadership and active pursuit of optimizing CI/CD process and tools, testing frameworks and practices
• Must be proactive, demonstrate initiative, and be a logical thinker
• Must be team oriented with strong collaboration, prioritization, and adaptability skills required

Additional Qualifications:
• Understanding of Cloud / Hybrid data architecture concepts
• Understanding of insurance industry and products
• Excited by trying new technology and learning new tools

Qualifications
• Bachelor’s degree in technical or business discipline or equivalent experience
• Generally, 3+ years of professional experience
• Strong oral and written communication skills; presentation skills
• Proficient in negotiation, facilitation and consensus building skills
• Proficient in new and emerging technologies
• Thorough knowledge of the following: IT concepts, strategies and methodologies
• Business function(s) and business operations
• Design and development tools
• Architectures and technical standards
• Thorough knowledge of layered systems architectures and layered solutions and designs; understanding of shared data engineering concepts
• Proficiency in multiple programming languages and tools
• Understanding of agile data engineering concepts and processes
• Must be proactive, demonstrate initiative, and be a logical thinker
• Consultative skills, including the ability to understand and apply customer requirements, including drawing out unforeseen implications and making recommendations for design, the ability to define design reasoning, understanding potential impacts of design requirements
• Collaboration, prioritization, and adaptability skills required

At Liberty Mutual, our purpose is to help people embrace today and confidently pursue tomorrow. That’s why we provide an environment focused on openness, inclusion, trust and respect. Here, you’ll discover our expansive range of roles, and a workplace where we aim to help turn your passion into a rewarding profession.

Liberty Mutual has proudly been recognized as a Great Place to Work by Great Place to Work® US for the past several years. We were also selected as one of the 100 Best Places to Work in IT on IDG’s Insider Pro and Computerworld’s 2020 list. For many years running, we have been named by Forbes as one of America’s Best Employers for Women and one of America’s Best Employers for New Graduates, as well as one of America’s Best Employers for Diversity. To learn more about our commitment to diversity and inclusion please visit:

We value your hard work, integrity and commitment to make things better, and we put people first by offering you benefits that support your life and well-being. To learn more about our benefit offerings please visit:

Liberty Mutual is an equal opportunity employer. We will not tolerate discrimination on the basis of race, color, national origin, sex, sexual orientation, gender identity, religion, age, disability, veteran’s status, pregnancy, genetic information or on any basis prohibited by federal, state or local law.
Apply Here
For Remote Senior/Software Engineer – Data & Analytics Engineering roles, visit Remote Senior/Software Engineer – Data & Analytics Engineering Roles

********

Sr. Data Engineer at Amazon Web Services (AWS)

Location: Boston

Job Summary

DESCRIPTION

The Amazon Web Services’ (AWS) Event Technology team is looking for a Data Engineer (DE) with a passion for working on cutting edge technology, who obsesses over bar-raising experiences for customers, and thrives on the challenge of building something new that will operate at world-wide scale to join our fast-paced team. In this role, you will be responsible for implementing modern, creative, and innovative data experiences across a breadth of customer facing web apps, our data warehouse for reporting, and integration with third party applications.

If you are passionate about building state-of-the-art web applications and services that focus on simplicity, innovation, and performance, consider applying for this role! Your work will have high visibility across AWS internally, and the global AWS community of developers and users.

What you will work on: The Events Technology team is building a new platform entirely on cutting edge AWS services – including CloudFront, API Gateway, Lambda, GraphQL, SQS, DynamoDB and more, with state-of-the-art open source front end frameworks, libraries, and toolchains to deliver a set of highly scalable, performant, serverless, and responsive webapps for our global events organization. Our application will be used by millions of users each year and generate significant amounts of data through which we can continually improve our customers’ experiences. You will own the end to end data pipeline from infrastructure to insights. Your work will have high visibility across the AWS team and global AWS community of developers and users.

What we are looking for: We are looking for a passionate data engineer to develop a robust, scalable data model and optimize the consumption of data sources we require to generate unique insights about our systems. You will share in the ownership of the technical vision and direction for advanced analytics and insight products. You will work with top-notch technical professionals developing complex systems at scale and with a focus on sustained operational excellence. Members of this team will be challenged to innovate using the latest big data techniques. We are looking for people who are motivated by thinking big, moving fast, and exploring business insights. If you love to implement solutions to hard problems while working hard, having fun, and making history, this may be the opportunity for you.

We want a person with a commitment to team work, who enjoys working on cutting edge technology in a fast-paced environment, is customer centric, and thrives on the challenge of building something new that will operate at world-wide scale.

Responsibilities

In this role, you will have the opportunity to display your skills in the following areas:
• Managing AWS resources including Glue, RDS, Redshift, QuickSight, etc.
• Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big data technologies
• Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency
• Collaborate with Business Intelligence Engineers (BIEs) to recognize and help adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation
• Collaborate with other tech teams to implement advanced analytics algorithms that exploit our rich datasets for statistical analysis, prediction, clustering and machine learning
• Work directly with our key customer stakeholders to help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers

Role is open to: Boston, Seattle, NYC, Arlington VA, Dallas, Atlanta, Denver.

Inclusive Team Culture

Here at AWS, we embrace our differences. We are committed to furthering our culture of inclusion. We have twelve employee-led affinity groups, reaching 40,000 employees in over 190 chapters globally. We have innovative benefit offerings, and host annual and ongoing learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences. Amazon’s culture of inclusion is reinforced within our 14 Leadership Principles, which remind team members to seek diverse perspectives, learn and be curious, and earn trust.

Work/Life Balance

Our team puts a high value on work-life balance. It isn’t about how many hours you spend at home or at work; it’s about the flow you establish that brings energy to both parts of your life. We believe striking the right balance between your personal and professional life is critical to life-long happiness and fulfillment. We offer flexibility in working hours and encourage you to find your own balance between your work and personal lives.

Mentorship & Career Growth

Our team is dedicated to supporting new members. We have a broad mix of experience levels and tenures, and we’re building an environment that celebrates knowledge sharing and mentorship. We care about your career growth and strive to assign projects based on what will help each team member develop into a better-rounded professional and enable them to take on more complex tasks in the future.

Basic Qualifications
• Bachelor’s degree in computer science, engineering, mathematics, or a related technical discipline, or equivalent experience
• 8+ years of industry experience in software development, data engineering, business intelligence, data science, or related field with a track record of manipulating, processing, and extracting value from large datasets
• 4+ years of experience with traditional RDBMS technologies, SQL and SQL tuning
• 4+ years of experience in data modeling, ETL development, and data warehousing
• Experience in one or more scripting or programming languages

Preferred Qualifications
• Knowledge of data management fundamentals, distributed systems as it pertains to data storage and computing
• Proficiency in building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets.
• Proficiency in building data products incrementally and integrating and managing datasets from multiple sources.
• Proven success in communicating with users, other technical teams, and senior management to collect requirements, describe data modeling decisions and data engineering strategy
• Experience leading large-scale data warehousing and analytics projects, including using AWS technologies – Redshift, S3, EC2, Data-pipeline and other big data technologies
• Experience providing technical leadership and mentoring other engineers for best practices on data engineering
• Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations
• Master’s degree in computer science, mathematics, statistics, economics, or other quantitative fields.
Amazon is committed to a diverse and inclusive workplace. Amazon is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status. For individuals with disabilities who would like to request an accommodation, please visit https://www.amazon.jobs/en/disability/us.

Company – Amazon Web Services, Inc.

Job ID: A2076598
Apply Here
For Remote Sr. Data Engineer roles, visit Remote Sr. Data Engineer Roles

********

Senior Data Engineer, BI at The Boston Beer Company

Location: Boston

Job Description

We are currently hiring a Senior Data Engineer, BI in Boston, MA or Remote.

This is a great role on a team that is transforming Business Intelligence technology at Boston Beer. The Data Engineer is a technical role with SQL Server and Azure cloud data technology at its core and has ample opportunity for technical growth as we move from traditional data technologies to cloud data architecture. As Boston Beer takes on new challenges in areas like supply chain and digital, this role will be highly involved in ensuring our data platform is scalable, performant, and ready for the future.

Working within the Boston-based Business Intelligence team, this role is responsible for various database activities all centered on delivering impactful Business Intelligence solutions and maintaining and improving our database systems. This is a multi-faceted role that includes technical analysis, design, development, testing, and maintenance as well as user interaction including requirements gathering, training, etc. This role will work on our on-premise SQL data warehouse platform and help build out the data platform of the future in Azure.

What You’ll Brew:
• Build out and migration to cloud data technologies, which may include (but not limited to) Azure Data Lake, Azure SQL Pools, Apache Spark, Synapse, Azure Data Factory
• SQL Server Database development work including tables, views, stored procedures and functions. SSIS development for Extract Transform Load (ETL) processes as well
• Performance tuning and testing of BI structures and processes
• Participation in the design of data models to house and deliver essential reporting
• Automation with PowerShell scripting
• Consultation with other IT personnel to perform essential analysis on source data
• Collaborate with BI Team members and key BI power users on specific data designs and intended use
• Collaborate on requirements definition, design and build out of key data warehousing and integration deliverables for Supply Chain, Brand Insights, Digital, Sales and other areas
• SQL scripting to perform essential data maintenance functions including reclassifications, some master data setup, and others
• This role may also be involved in data preparation and modeling using the Power BI platform as well as Analysis Services-Tabular
• Maintenance and extension of our SQL server jobs and Azure PowerShell runbooks for update of our SQL and Azure environments
• Partner with the BI team to promote and evolve design, development, and testing standards
• Adhere to IT department policies and procedures when performing tasks to ensure controls are in place and departmental goals are met. Adhere to standards for testing, documentation, and change management
• Occasional travel may be required (<10%) to Pennsylvania, Delaware and Ohio breweries

What Ingredients You’ll Bring:

Minimum Requirements
• Bachelor’s degree in computer science or equivalent relevant experience.
• 5+ years in a data integration or data warehouse development role, preferably in a Microsoft environment
• 3+ years hands-on development experience with SQL Server, SSIS, SSAS
• 1+ years working with Azure cloud data technologies including migration, integration, and design
• PowerShell scripting experience
• Working conditions: Occasional travel may be required (<10%) to Pennsylvania, Delaware and Ohio breweries.

Preferred Requirements
• Microsoft SQL Server, Azure data technologies, Power BI training and certification all a plus.
• Microsoft Power BI platform, including DAX and Power Query expressions and Analysis Services Tabular, a big plus
• Python scripting a plus
• MuleSoft, SAP HANA data experience helpful
• Other cloud platforms such as AWS a plus

Some Perks:

Our people are our most important “ingredient.” We hire the best talent; and we reward, develop, and retain them too. In addition to generous healthcare on day one, stock purchase plan, 401k and more, Full Time Boston Beer Coworkers have the following perks available*:
• Tuition reimbursement
• Fertility/adoption support
• Free financial coaching
• Health & wellness program and discounts
• Professional development & training
• Free beer!
• Talk to your recruiter about eligibility

Boston Beer Corporation is an equal opportunity employer and is committed to a diverse workforce. In order to help ensure reasonable accommodation for individuals protected by Section 503 of the Rehabilitation Act of 1973, the Vietnam Veteran’s Readjustment Act of 1974, and Title I of the Americans with Disabilities Act of 1990, applicants who wish to request accommodation in the job application process can contact jobs@bostonbeer.com for assistance.
Apply Here
For Remote Senior Data Engineer, BI roles, visit Remote Senior Data Engineer, BI Roles

********
