Full-time Data Engineer openings in Boston, September 7, 2022

Data Engineer at Radancy

Location: Boston

Radancy Data Engineering is seeking a Data Engineer to support building new data products and services.

About
• Radancy Data Engineering works on data services across product organizations within Radancy and supports building a customer-facing data visualization product. The Data Engineering team supports an enterprise-grade recruitment platform focused on talent acquisition and job opportunity exploration.
• The team has extensive experience in ETL development, works with large-scale data in real time, and collaborates with other engineering teams across the organization.

Responsibilities:
• Build and maintain ETL pipelines in Python that connect first- and third-party data
• Work with Cloud Computing Platforms (GCP/AWS), Luigi, Kafka and other open-source technologies
• Conduct data modeling, schema design, and SQL development
• Ingest and aggregate data from both internal and external data sources to build our world class datasets
• Develop and lead the testing and fixing of new or enhanced solutions for data products and reports, including automating ETL testing
• Collaborate with Product Owner and domain experts to recognize and help adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation
• Assist with the development and review of technical and end user documentation including ETL workflows, research, and data analysis
• Work with Product team to define data collection and engineering frameworks
• Build monitoring dashboards and automate data quality testing
• Perform daily integrity checks, deployments, and releases
• Own meaningful parts of our service, have an impact, grow with the company

Qualifications:
• 3+ years of Python, SQL, and ETL development experience
• Bachelor's or Master's degree in Computer Science or a related field
• Product / reporting suite experience
• Familiarity with C#, .Net, Kafka, Docker
• Exposure to front end development: HTML, JavaScript, jQuery, Angular or similar libraries
• Exposure / familiarity with Google Cloud Platform / BigQuery / Amazon Redshift
• AdTech experience preferred
• Enthusiastic about working with and exploring new data sets
• Detail oriented and strong communicator

Join the global leader in talent acquisition technologies that’s committed to finding new ways to leverage software, strategy and creative to enhance our clients’ employer brands – across every connection point. We’re looking for unconventional thinkers. Relentless collaborators. And ferocious innovators. Talented individuals who are ready to work towards solutions that transform the way employers and job seekers connect.

Radancy is an equal opportunity employer and welcomes all qualified applicants regardless of race, ethnicity, religion, gender, gender identity, sexual orientation, disability status, protected veteran status, or any other characteristic protected by law. We actively work to create an inclusive environment where all of our employees can thrive.
Apply Here
For Remote Data Engineer roles, visit Remote Data Engineer Roles

********

Data Engineer (Remote Eligible) at Capital One

Location: Boston

Center 1 (19052), United States of America, McLean, Virginia

Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you’ll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs. We are seeking Data Engineers who are passionate about marrying data with emerging technologies. As a Capital One Data Engineer, you’ll have the opportunity to be on the forefront of driving a major transformation within Capital One.

What You’ll Do:
• Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies
• Work with a team of developers with deep experience in Public cloud services (AWS), Big Data, machine learning, distributed microservices, and full stack systems
• Utilize programming languages like Java, Scala, Python and Open Source RDBMS and NoSQL databases and Cloud based data warehousing services such as Redshift and Snowflake
• Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community
• Collaborate with digital product managers, and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment
• Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance

Capital One is open to hiring a Remote Employee for this opportunity.

Basic Qualifications:
• Bachelor’s Degree
• At least 2 years of experience in application development (Internship experience does not apply)
• At least 1 year of experience in big data technologies

Preferred Qualifications:
• 3+ years of experience in application development including Python, SQL, Scala, or Java
• 3+ years of experience with distributed data computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL)
• 3+ years of data warehousing and relational database experience (Snowflake, Oracle, or PostgreSQL)
• 3+ years of experience with PySpark
• 2+ years of experience with AWS services (EC2, EMR, ECS, S3, SNS, Lambda, RDS, EFS, or VPC)
• 2+ years of experience with AWS Security Groups or Identity and Access Management
• 2+ years of experience building CI/CD pipelines
• 2+ years of experience with UNIX/Linux including basic commands and Shell Scripting
• 1+ years of experience working on Real Time data and streaming applications
• 1+ years of experience with NoSQL implementation (Mongo or Cassandra)
• 1+ years of experience with Agile engineering practices

At this time, Capital One will not sponsor a new applicant for employment authorization for this position. No agencies please. Capital One is an Equal Opportunity Employer committed to diversity and inclusion in the workplace. All qualified applicants will receive consideration for employment without regard to sex, race, color, age, national origin, religion, physical and mental disability, genetic information, marital status, sexual orientation, gender identity/assignment, citizenship, pregnancy or maternity, protected veteran status, or any other status prohibited by applicable national, federal, state or local law. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City’s Fair Chance Act; Philadelphia’s Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries.

If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at (see below) . All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations.

For technical support or questions about Capital One’s recruiting process, please send an email to (see below)

Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site.

Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).
Apply Here
For Remote Data Engineer (Remote Eligible) roles, visit Remote Data Engineer (Remote Eligible) Roles

********

Data Engineer at WeSpire

Location: Boston

About WeSpire

WeSpire is an online technology platform that allows world-class enterprise organizations to design, run, and measure positive employee programs in areas such as sustainability, giving and volunteering, wellbeing, diversity, equity and inclusion, and corporate culture. WeSpire's employee engagement software and campaigns, driven by proven behavioral science, encourage the entire workforce to make a positive impact at work and in their communities.

Why WeSpire?

We are a growing post-Series B startup located in the heart of Boston's bustling innovation district. Our benefits include:
• Remote-friendly culture, both in normal times and in our current day-to-day. We care far more about your talents, insights, and ability to execute than your current address
• Public Benefit Corporation
• Truly a work/life balance environment
• Competitive salary to work on important problems with global impact
• Flexible PTO policy
• Generous group health, dental, and vision benefits
• Opportunity for equity in a fast-growing company
• Meaningful work on a platform to help large employers support sustainable development around the globe

With that mission in mind, WeSpire is hiring a Data Engineer (BI and ETL) who will be responsible for implementing our embedded product analytics. This critical role requires the ideal candidate to build on their existing experience engineering enterprise data and analytics solutions, including embedded analytics and data extract/transform/load (ETL) solutions that draw from a variety of data sources. You will have the opportunity to take on complex data challenges related to product reporting and analytics, work with peers to help manage our data infrastructure, and provide unique insights into our platform.

Does all this sound familiar? Ready for your next fulfilling adventure? Do you enjoy working with an amazing team that supports sustainability, social impact, personal wellbeing, and more? Keep reading; we'd love to hear from you!

What you’ll be doing
• Collaborate with both technical and non-technical areas of the business, bridging the gap between the business problem and the technical solution
• Ensure that data is central to client success, product optimizations and long-term strategy
• Deliver on an embedded self-service reporting roadmap allowing our clients to measure and optimize employee engagement
• Implement tracking plans and tagging taxonomies that allow us to track user experience from acquisition to engagement, activation, and retention
• Drive proactive analysis that identifies opportunities to delight our customers
• Identify and operationalize the key inflection points of customer experience into reporting frameworks

What you should have
• 3+ years of professional work experience implementing embedded Analytics programs in a high-growth environment
• A hunger to partner with stakeholders to challenge assumptions, understand needs, use cases, and problems to be solved
• Exceptional analytical and conceptual thinking skills to draw business-relevant conclusions
• Experience with data visualization tools such as Tableau, ThoughtSpot, Sisense, or Domo
• Familiarity with cloud technologies and platforms like AWS, Azure, and GCP
• Experience working in an Agile product development lifecycle
• Good knowledge of SQL and Python is a must
• Experience with Snowflake, DBT, Airbyte / Fivetran, Github
• Excellent problem solving skills
• Startup experience a strong plus
• Sense of humor and humility a strong plus

As a world-class employer, WeSpire is committed to providing an environment where any and all people feel welcome, respected, and free to be their authentic selves. We welcome applicants of all gender identities, sexual orientation, educational background, religion, ethnicities, veteran status, and citizenships.

We’d love to learn what you can add to our team and thrive together!
Apply Here
For Remote Data Engineer roles, visit Remote Data Engineer Roles

********

Lead Data Engineer at Quantitative Systems

Location: Boston

• Looking for either mid-to-senior level (5–7 years) or senior level (7–10 years) candidates.
• Although the title says Lead, the role is an individual contributor position (you won't manage the team).
• Ideally we need engineers with cloud-native experience, especially with AWS or Azure, who have potentially worked across SQL and NoSQL data types and also know Python, PySpark, Scala, Golang, or C#.
Apply Here
For Remote Lead Data Engineer roles, visit Remote Lead Data Engineer Roles

********

Engineer V, Data Engineering/Architect- Python, SQL-Remote at Omnicell

Location: Boston

Do you want to make a meaningful difference in the quality of healthcare? Omnicell is empowering health systems and pharmacies to radically transform the way they manage medications, so they can achieve the vision of the Autonomous Pharmacy. Thousands of hospitals, pharmacies, skilled nursing facilities and care homes trust Omnicell to provide continuous innovation. They need us to deliver solutions to meet the ever-evolving challenges of the healthcare landscape. We encourage creative problem solving and outside-the-box thinking that only a diverse, well-rounded workforce can bring.

Join us as we build on our powerful combination of advanced automation, predictive intelligence, and expert services to create a safer future for patients-one where medication errors are a thing of the past. You’ll be joining an organization whose culture encourages individual development, rewards intellectual curiosity, and embraces an inclusive environment. Join our growing company and help shape the future at Omnicell!

Leveraging the power of big data, IoT, and data science, Omnicell's Customer Experience Intelligent Insights (CxII) team is on a mission to optimize customer experience through data-driven business decision making and predictive telemetry. Central to that mission is the data warehouse. We are hiring a Principal Software Engineer/Database Architect who will be a key member of the technical team designing and building a new data warehouse as well as high-impact analytics applications. This position is remote-eligible.

Responsibilities
• Needs understanding: analyze and determine the greatest value, based on return and effort, that can be delivered to the CX organization.
• Lead/participate in the full lifecycle of the data warehouse design from gathering and understanding available data sources, end-user analytics and reporting needs; architecting solution; detailing the technical design. Develop various data warehouse components: data integration (ETL), databases, and metadata with both traditional and big data technologies.
• Lead/participate in ad hoc analytics projects from requirement gathering, to hands-on implementation, to end-user training.
• Represent the team to interface with internal and external customers and partners.

Knowledge / Skills:
• Strong understanding of data warehousing, ETL, business intelligence, analytics, machine learning, and data visualization
• Large-scale telemetry pipeline architecture, implementation, and optimization; IoT; data modeling and analysis
• Cross organizational collaboration and coordination
• Defining solutions based on business needs and data insights
• Extensive experience in designing and building data warehouses.
• Extensive data engineering experience in big data (Hadoop, Spark, Hive, Python) and Microsoft SQL Server (MS SQL) environments.
• Experience in business intelligence system development (Tableau, Qlik).
• Self-starter and analytical thinker that can work independently and drive projects forward.
• Strong skills in communicating and presenting data-derived insights to non-technical audiences appropriately.
• A track record of mentoring, managing, or leading junior analytics or data engineering staff.

Basic Qualifications
• BS in Computer Science.
• 15+ years of professional experience, including 5+ years of experience designing and building new big data-based data warehouse systems and running ETL operations.
• Experience in building business intelligence solutions.
• Experience in handling large datasets and expertise in big data as well as Microsoft database technologies.

Preferred Qualifications
• Domain knowledge and past experiences in health care, medication management, patient care, etc.
• Knowledge and experience in applying machine learning/AI to solve real world problems.
• Experience with Omnicell systems.
Apply Here
For Remote Engineer V, Data Engineering/Architect- Python, SQL-Remote roles, visit Remote Engineer V, Data Engineering/Architect- Python, SQL-Remote Roles

********

Senior Data Engineer – $150k Conversion Salary – Remote Contract to Hire at Burtch Works

Location: Boston

SENIOR DATA ENGINEER – REMOTE CONTRACT TO HIRE:
• $65+/hr pay rate; 6-month contract to hire; $150K conversion salary
• US citizens and green card holders only

Tech Stack Requirements: AWS Cloud Environment, Snowflake, EMR, Redshift, Spark, Pyspark, and ETL tools

Essential Functions:
• Research, develop, document, and modify Big Data Lake processes and jobs according to data architecture and modeling requirements, and processes set forth by the senior Business Intelligence (BI) data team members
• Maintain accountability for completing the key EDW/Big Data Lake project activities; communicate project status and risks to project leader
• Troubleshoot issues including connection, failed jobs, application errors, server alerts, and space thresholds within predefined service level agreements (SLAs)
• Proactively maintain and tune SQL code according to EDW and Big Data Lake best practices
• Review and ensure appropriate documentation for all new development and modifications of the big data processes and jobs
• Perform unit testing for solutions developed, and ensure integrity and security of institutional data
• Take technical specifications and complete development of data solutions for analytics
• Take business specifications for small-to-medium projects/data requests and create technical specifications and data models
• Perform code reviews of work from other team members

Requirements:

• Strong SQL knowledge and skills required
• Familiarity with an enterprise AWS cloud environment or similar data platform, with an emphasis on EMR and Redshift, required
• Experience with Spark/Spark Streaming and Python required
• Knowledge of Postgres and SQL Server preferred
• Knowledge of ETL tools such as Informatica or DataStage preferred
Apply Here
For Remote Senior Data Engineer – $150k Conversion Salary – Remote Contract to Hire roles, visit Remote Senior Data Engineer – $150k Conversion Salary – Remote Contract to Hire Roles

********

Data Engineer at O’Reilly Media Inc

Location: Boston

Description

About Your Team

Our data engineering team has a strong focus on delivering high-quality, reliable data to platforms and people within O’Reilly as well as building high-performance, scalable and extensible systems. We are intentional in our search for teammates who are helpful, respectful, communicate openly, and are always willing to do what’s best for our users. We keep a close eye on our pipelines and processes to make sure we’re delivering useful, timely improvements to aid decision-making and data visualization within O’Reilly. The team is broadly distributed across the US in multiple cities and timezones and constantly encourages each other to deliver work that instills pride and fulfillment.
About the Job

We are looking for a thoughtful and experienced data engineer to help grow a suite of systems and tools written primarily in Python. The ideal candidate will have a deep understanding of modern data engineering concepts and will have shipped or supported code and infrastructure with a user base in the millions and datasets with billions of records. The candidate will be routinely implementing features, fixing bugs, performing maintenance, consulting with product managers, and troubleshooting problems. Changes you make will be accompanied by tests to confirm desired behavior. Code reviews, in the form of pull requests reviewed by peers, are a regular and expected part of the job as well.
Job Details

In a normal week, you might:
• Develop a new feature from a user story using Python and PostgreSQL or BigQuery
• Collaborate with product managers to define clear requirements, deliverables, and milestones
• Team up with other groups within O’Reilly (e.g. data science or machine learning) to leverage experience and consult on data engineering best practices
• Review a pull request from a coworker and pair on a tricky problem
• Provide a consistent and reliable estimate to assess risk for a project manager
• Learn about a new technology or paper and present it to the team
• Identify opportunities to improve our pipelines through research and proof-of-concepts
• Help QA and troubleshoot a pesky production problem
• Participate in agile process and scrum ceremonies

Why you’ll love working on our team:
• You’ll be working for a company that embraces and pursues new technology
• You’ll be working with a company that trusts and engages its employees
• We believe in giving engineers the tools and hardware that they need to do their job
• Bi-weekly virtual team hangouts and space to learn new skills (we’re a learning company after all!)
• Great company benefits (health/dental/vision insurance, 401k, etc.)
• We care deeply about work-life balance and treat everyone like human beings first

About You

What we like to see for anyone joining our data engineering teams:
• Proficiency in building highly scalable ETL and streaming-based data pipelines using Google Cloud Platform services and products
• Proficiency in large scale data platforms and data processing systems such as Google BigQuery and Amazon Redshift
• Excellent Python and PostgreSQL development and debugging skills
• Experience building systems to retrieve and aggregate data from event-driven messaging frameworks (e.g. RabbitMQ and Pub/Sub)
• Strong drive to experiment, learn and improve your skills
• Respect for the craft: you write self-documenting code using modern techniques
• Great written communication skills: we do a lot of work asynchronously in Slack and Google Docs
• Empathy for our users: a willingness to spend time understanding their needs and difficulties is central to the team
• Desire to be part of a compact, fun, and hard-working team

Not required, but for bonus points:
• Experience with Google Cloud Dataflow/Apache Beam
• Experience with Django RESTful endpoints
• Experience working in a distributed team
• Knowledge and experience with machine learning pipelines
• Contributions to open source projects
• Knack for benchmarking and optimization
Minimum Qualifications
• 2+ years of professional data engineering (or equivalent) experience
• 1+ year experience of working in an agile environment

About O’Reilly Media

O’Reilly’s mission is to change the world by sharing the knowledge of innovators. For over 40 years, we’ve inspired companies and individuals to do new things, and do things better, by providing them with the skills and understanding that are necessary for success.

At the heart of our business is a unique network of experts and innovators who share their knowledge through us. O’Reilly Learning offers exclusive live training, interactive learning, a certification experience, books, videos, and more, making it easier for our customers to develop the expertise they need to get ahead. And our books have been heralded for decades as the definitive place to learn about the technologies that are shaping the future. Everything we do is to help professionals from a variety of fields learn best practices and discover emerging trends that will shape the future of the tech industry.

Our customers are hungry to build the innovations that propel the world forward. And we help you do just that.


Diversity

At O’Reilly, we believe that true innovation depends on hearing from, and listening to, people with a variety of perspectives. We want our whole organization to recognize, include, and encourage people of all races, ethnicities, genders, ages, abilities, religions, sexual orientations, and professional roles.
Apply Here
For Remote Data Engineer roles, visit Remote Data Engineer Roles

********

Principal Data Engineer (Python, Spark, AWS) | Boston, MA at Jobs via eFinancialCareers

Location: Boston

Principal Data Engineer (AWS, Python, Spark) | Elite Quant Investment Firm | Boston

One of Boston’s top quantitative investment firms has poured millions of dollars into transforming the technology that powers its quantitative trading systems; these transformations have fueled industry-leading success over the past 5 years, and its assets under management have increased by more than $100 billion in that time. The firm is rapidly migrating to AWS and building next-generation data systems that are highly automated, scalable, and secure. These systems will handle petabytes of complex market data, supporting multiple types of equities and securities.

This role will design highly performant solutions that model data for the firm’s mission-critical trading systems. You will work closely with a small team of senior engineers and collaborate with the business to ensure your solutions are scalable enough to react to changes and opportunities in the market in real time. You will be working in a greenfield environment to build systems and models from scratch, and will be responsible for design, development, and delivery.

Key Responsibilities:
• Design and build next generation cloud native data systems in a greenfield AWS environment
• Enhance existing processes to ensure performance, availability, and data quality are optimized
• Define best practices and act as a thought leader throughout the firm on AWS adoption

Required Qualifications:
• BS or equivalent degree
• 6+ years of hands-on experience building data ingestion systems, pipelines, and/or models
• 3+ years of experience with Spark and Python
• Expertise with SQL and Pandas/NumPy
• Experience designing solutions with a range of AWS Services
• Some experience with Kubernetes, C#, and Pytest is preferred but not required

Capitalize on:
• Extremely competitive employee benefits program, 401K, and compensation package
• One of the best-rated work cultures and work/life balances in the business
• Casual dress, collaborative teams, and work from home flexibility
Apply Here
For Remote Principal Data Engineer (Python, Spark, AWS) | Boston, MA roles, visit Remote Principal Data Engineer (Python, Spark, AWS) | Boston, MA Roles

********

Lead Data Engineer (Remote-Eligible) at Capital One

Location: Boston

West Creek 3 (12073), United States of America, Richmond, Virginia

Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you’ll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs.

Team Info:

The Finance Technology team at Capital One is searching for innovative and analytical Data Engineers to join our team. Our data engineers are multi-linguists who can speak the languages of how we operate as a business, how that business impacts our financials, and the latest technologies that are reshaping our Finance Tech landscape. In this role, you will be responsible for building ETL data pipelines & frameworks using open source tools on public Cloud platforms.

What You’ll Do:
• Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies
• Leverage ETL programming skills in open source languages including Python, Scala, and SQL on various frameworks
• Provide technical guidance concerning business implications of application development projects
• Deploy DevOps techniques and practices like Continuous Integration, Continuous Deployment, Test Automation, Build Automation and Test-Driven Development to enable the rapid delivery of working code utilizing tools like Jenkins, Nexus, Maven, Github, and Docker
• Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community
• Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance
• Manage multiple responsibilities in an unstructured environment where you’re empowered to make a difference. In that context, you will be expected to research and develop cutting edge technologies to accomplish your goals
• Influence and provide technical guidance concerning business implications of application development projects

Capital One is open to hiring a Remote Employee for this opportunity.

Basic Qualifications:
• Bachelor’s Degree
• At least 6 years of experience in application development (Internship experience does not apply)
• At least 2 years of experience in big data technologies
• At least 1 year experience with cloud computing (AWS, Microsoft Azure, Google Cloud)

Preferred Qualifications:
• 7+ years of experience in application development, including Python, SQL, Spark/Scala, or Java; PySpark on Maven builds; REST; JSON; relational databases; and CI/CD on Jenkins
• 7+ years of experience developing ETL solutions
• 4+ years of experience developing, deploying, and testing in the AWS public cloud
• 4+ years of experience with cloud computing, preferably AWS and its services, including deploying S3, EMR/EC2, SNS, SQS, and Lambda functions
• 4+ years of experience with distributed data/computing tools (MapReduce, EMR, S3, Lambda, Kafka, Spark, or MySQL)
• 4+ years of experience delivering large-scale dataset solutions and SDLC best practices
• 4+ years of experience working on real-time data and streaming applications
• 4+ years of experience with NoSQL implementations (Mongo, Cassandra)
• 4+ years of experience with UNIX/Linux including basic commands and Shell Scripting
• 2+ years of experience with Agile engineering practices
• AWS Certification

At this time, Capital One will not sponsor a new applicant for employment authorization for this position. No agencies please. Capital One is an Equal Opportunity Employer committed to diversity and inclusion in the workplace. All qualified applicants will receive consideration for employment without regard to sex, race, color, age, national origin, religion, physical and mental disability, genetic information, marital status, sexual orientation, gender identity/assignment, citizenship, pregnancy or maternity, protected veteran status, or any other status prohibited by applicable national, federal, state or local law. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City’s Fair Chance Act; Philadelphia’s Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries.

If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at (see below) . All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations.

For technical support or questions about Capital One’s recruiting process, please send an email to (see below)

Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site.

Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).
Apply Here
For Remote Lead Data Engineer (Remote-Eligible) roles, visit Remote Lead Data Engineer (Remote-Eligible) Roles

********

Data Engineer at Paytronix Systems

Location: Newton

The kind of person we’re looking for:

We're looking for a data engineer who is interested in working with data science and large-scale data processing and in bringing them to an enterprise level of code quality. Our SaaS platform produces a very high volume of data from different sources and in different formats. The Data Science team develops the platform that stores, processes, and analyzes this data to produce and productize ideas for our customers. We are a highly team-oriented environment where collaboration between developers and across teams is common, particularly in this role: you will work with the data science teams to understand the algorithms, and with the engineering team to move products to production reliably. Giving and receiving feedback, debate, and ideas for improvement are actively encouraged.

Our company has an open, relaxed, and friendly environment where jokes and silliness are common, yet we’re serious about the work we do. We are a group of bright, curious, and empathetic individuals working together to build data products that bring innovation to the market and provide useful insights to marketers. We value diverse perspectives and believe that encouraging everyone to use their voice ultimately makes our products better!

The kind of stuff you’ll be doing:

Working with the best tools of the modern data stack: Snowflake, Fivetran, HVR, dbt, Airflow (Astronomer), Looker
Being a thought leader in the design of our infrastructure that delivers data to our Data Science Teams
Creating new real-time ETL pipelines with data from our transactional platform that will enable better analytical models for our Data Scientists and Analysts
Working with our Engineers on data pipelines for data to be available for use in our transactional platform
Working with our Data Scientists to develop improved data pipelines and models for their use

Who you will be working with:

Data Analysts within the Data Insights Team
Data Engineers and Data Scientists within our team (Data Science)
Engineers and Product Managers within the Transactional Processing Team

The kind of experience you’ll need:

5+ years writing SQL
3+ years writing Python (specific to data processing)
3+ years working with data pipelines
2+ years architecting data warehouse schemas
1+ year working with dbt
1+ year working with modern ETL tools (Fivetran, HVR, or equivalent)
1+ year working with Airflow

The extra stuff that would be nice:

Experience with Snowflake
Experience with Astronomer
Experience with Looker
Experience with Jupyter Notebooks

Benefits:

Medical – Choose from 3 Medical Plans (PPO, HMO, and High-Deductible)
Dental – Select from 2 Dental plans, including adult Orthodontics
Vision – 1 plan available
Company paid Life Insurance, Short-Term Disability (STD) and Long-Term Disability (LTD)
401(k) with generous company match; 2-year vesting
Flexible Spending Account (FSA)
Health Reimbursement Account (HRA)
Tuition Reimbursement
Generous PTO
One paid charity day

For information on our benefits, see here.
Apply Here
For Remote Data Engineer roles, visit Remote Data Engineer Roles

********

The Tech Career Guru