Full-time Data Engineer openings in the Houston, Texas Area on September 06, 2022

Associate Data Engineer at MD Anderson Cancer Center

Location: Houston

Requisition #: 152853

The Associate Data Engineer in the area of Data Analytics & Delivery is a role in the Enterprise Data Engineering & Analytics Department responsible for operationalizing critical data and analytics for MD Anderson’s digital business initiatives. The Associate Data Engineer participates in business requirements gathering, components of end-to-end solution development, and data analytics delivery within the Context Engine. The Associate Data Engineer partners with other Enterprise Data Engineering & Analytics teams to assist in building components of analytics deliverables for production use by our data and analytics consumers.

The Associate Data Engineer also assists in planning and coordinating components of the data analytics delivery activities in compliance with data governance processes and data security requirements. This results in enabling faster data delivery, integrated data reuse and vastly improved time-to-solution for MD Anderson data and analytics initiatives.

The Associate Data Engineer role will require both creative and collaborative work with Principal Data Engineers, Senior Data Engineers, and Data Engineers across the department.

Data Engineering – End-to-End Solution Delivery

Participate in components of end-to-end solution delivery that increase information capabilities and realize data value across the institution. End-to-end solutions include the build-out of data sources and tools across the Context Engine framework, integrating data governance processes through data ingestion, ingress, egress, curation, pipeline build, data transformation and modeling steps. Build out components across data governance processes that consistently track data provenance, security, data quality and ontology, through to data visualization and insights.

Participate in existing components of end-to-end data pipelines consisting of a series of stages through which data flows (for example, from data sources or endpoints of acquisition to integration to consumption for specific use cases).
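The staged flow described above (source to integration to consumption) can be sketched as a chain of small functions. This is a minimal illustration only; the stage names and the record shape are invented, not taken from the posting.

```python
# Illustrative sketch of a staged data pipeline: acquisition -> integration -> consumption.
# All names and the record shape are hypothetical.

def acquire():
    """Stage 1: pull raw records from a source system (hard-coded here)."""
    return [{"patient_id": 1, "lab": "A1C", "value": "6.4"},
            {"patient_id": 2, "lab": "A1C", "value": "bad"}]

def integrate(records):
    """Stage 2: validate and normalize; drop rows that fail type checks."""
    out = []
    for r in records:
        try:
            out.append({**r, "value": float(r["value"])})
        except ValueError:
            continue  # in practice, route to a quarantine table instead
    return out

def consume(records):
    """Stage 3: aggregate for a specific use case (mean value per lab test)."""
    by_lab = {}
    for r in records:
        by_lab.setdefault(r["lab"], []).append(r["value"])
    return {lab: sum(v) / len(v) for lab, v in by_lab.items()}

result = consume(integrate(acquire()))
print(result)  # {'A1C': 6.4}
```

Each stage only consumes the previous stage's output, which is what lets an engineer participate in one component of the pipeline without owning the whole flow.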

Incorporate components of data governance and metadata management processes into the data ingestion, curation and pipeline building efforts.

Participate in data requirements gathering for various components of end-to-end analytics deliverables to ensure we are delivering what is needed, not only what is requested.

Participate in and implement components of data analytics deliverables, including data analysis, report requests, metrics, extracts, visualizations, projects and dashboards, in a timely manner by leveraging tools and methodologies in line with the Context Engine Strategy.

Perform problem solving, formulation, testing and analysis of data. Design queries using Structured Query Language (SQL) and NoSQL.
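As a sketch of the kind of SQL query design this duty refers to, the snippet below runs an aggregate query through Python's built-in sqlite3 module. The table name and columns are invented for illustration.

```python
import sqlite3

# Hypothetical example of query design in SQL; schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (dept TEXT, visits INTEGER)")
conn.executemany("INSERT INTO metrics VALUES (?, ?)",
                 [("Oncology", 120), ("Radiology", 80), ("Oncology", 40)])

# Aggregate query of the kind an associate engineer might draft for a report request.
rows = conn.execute(
    "SELECT dept, SUM(visits) FROM metrics GROUP BY dept ORDER BY dept"
).fetchall()
print(rows)  # [('Oncology', 160), ('Radiology', 80)]
```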

Adhere to institutional data management strategies.

Standards, Testing & System Maintenance

Adhere to standard operating procedures set by the IS division as well as all MDA policies, and maintain build standards (data steward / governance oversight sign-off) in support of the MDA institutional data strategy, including the Context Engine.

Participate in documentation preparation as needed for the implementation of enhancements or new technology.

Adhere to documented change control processes and may perform change control audits.

Perform quality control and testing and review the build of other analysts to ensure that solutions are technically sound.

Assist in overseeing analytics system updates/new releases for assigned modules.

Adhere to regulatory requirements, quality standards and best practices for systems and processes, and collaborate with internal and external stakeholders.

Participate in after-hours application support and downtime procedures.

Educate and Train

Participate in training counterparts, such as data scientists, data analysts, end users or any data consumers, in data pipelining and preparation techniques, which make it easier for them to integrate and consume the data they need for their own use cases.

Assist in establishing training plans for various systems in the Context Engine Tools suite and develop curricula in partnership with the MDA Training team and EDEA system experts.

Provide institutional, department and one-on-one training on EDEA deliverables.

Assist in supporting liaison relationships with customers and OneIS to provide effective technical solutions and customer service.

OneIS

To provide innovative, quality, and sustainable IT solutions and services. Our success is driven by our people through Integrity and Trust, Partnership, and Quality.

Promotes trust, respect, support, and honesty with customers and each other.

Commits to being a good partner focused on building productive, collaborative, and trusting relationships with our customers and each other.

Models a commitment to excellence and strives to continually improve. Achieves desired outcomes, usability, and value that exceed expectations of others and our own.

Other duties as assigned

Education: Bachelor’s degree.

Preferred Education: Master’s Level Degree

Certification: Must obtain at least one Epic Data Model certification (Clinical, Access, or Revenue) issued by Epic within 180 days of date of entry into job.

Preferred Certification: Python, PySpark, Spark certifications

Experience Required: None. Required education may be substituted with years of related experience on a one-to-one basis.

It is the policy of The University of Texas MD Anderson Cancer Center to provide equal employment opportunity without regard to race, color, religion, age, national origin, sex, gender, sexual orientation, gender identity/expression, disability, protected veteran status, genetic information, or any other basis protected by institutional policy or by federal, state or local laws unless such distinction is required by law. http://www.mdanderson.org/about-us/legal-and-policy/legal-statements/eeo-affirmative-action.html

Additional Information
• Requisition ID: 152853
• Employment Status: Full-Time
• Employee Status: Regular
• FLSA: exempt and not eligible for overtime pay
• Work Week: Days
• Fund Type: Hard
• Work Location: Remote (within Texas only)
• Pivotal Position: Yes
• Minimum Salary: US Dollar (USD) 63,000
• Midpoint Salary: US Dollar (USD) 79,000
• Maximum Salary : US Dollar (USD) 95,000
• Science Jobs: No
Apply Here

********

Senior Data Engineer – Remote (Houston, TX) at The Hartford

Location: Houston

You are a driven and motivated problem solver ready to pursue meaningful work. You strive to make an impact every day & not only at work, but in your personal life and community too. If that sounds like you, then you’ve landed in the right place.

Come join The Hartford’s talented Sales and Distribution (S&D) IT team as a Sr Data Engineer to help drive the data vision aligned to the business strategy! We are seeking a detail-oriented, results-driven Senior Software Engineer to support projects in an agile environment. Successful candidates will demonstrate strong technical, analytical and interpersonal skills accompanied by proven experience in delivering quality technical solutions, as well as setting and executing on a strategic technical vision. In addition, candidates must have an aptitude for understanding existing processes and systems and the desire to continually improve them. They should be able to make decisions quickly in consultation with team members, build relationships, actively participate in teamwork, and understand the dynamics and critical nature of the business.

Responsibilities of the position include:
• Provide technical leadership by enabling the vision of the application architecture and safeguard the integrity of the application environment
• Assist architects in designing and implementing application integration involving a range of applications, from third-party off-premise cloud applications to on-premise legacy applications
• Understand and implement the technical vision for projects, or systems, keeping in mind cross-functional impacts, organizational impacts and architecture rationalization
• Operate as a subject matter expert advocating for the software applications supported. Possess a depth and breadth of knowledge of the application’s business, technologies, and integrations.
• Responsible for the end-to-end technical solution; goes beyond borders to ensure the success of the overall technical solution. Works closely with vendor software providers to drive optimal solutions.
• Directly develop application components and oversee technical deliverables from junior Developers through the software development life cycle
• Proactively address technical issues and risks that could impact project schedule and/or budget
• Work closely with stakeholders to design and document automation solutions that align with the business needs and also consistent with the architectural vision
• Mentor and train project team members (including sourcing partners)
• Lead change by influencing and educating Business, IT and sourcing partners
• Identify opportunities to reduce total cost of ownership in the operational application support

Qualifications of the position include:
• Bachelor’s Degree or equivalent work experience with at least six or more years of programming/systems analysis developing integration solutions
• Experience with Microsoft SQL services, including SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS)
• Experience with relational databases such as: Oracle, SQL Server, and PL/SQL
• Ability to leverage native integration capabilities of commercial off-the-shelf software
• Demonstrate skills in shaping and leading development of technical specifications
• Agile development framework and DevOps implementation experience a strong plus
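The SSIS experience called for above centers on the extract-transform-load pattern. As a rough, hedged stand-in (no SQL Server driver is assumed here, so sqlite3 plays the role of both source and destination, and the schema is invented), the pattern looks like this:

```python
import sqlite3

# Sketch of the extract-transform-load pattern that SSIS packages automate.
# sqlite3 stands in for SQL Server; table names and columns are hypothetical.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE policies_raw (id INTEGER, premium TEXT)")
src.executemany("INSERT INTO policies_raw VALUES (?, ?)",
                [(1, "1500.00"), (2, "n/a"), (3, "980.50")])

dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE policies_clean (id INTEGER, premium REAL)")

# Extract each row, transform (cast, filter bad rows), and load.
for pid, premium in src.execute("SELECT id, premium FROM policies_raw"):
    try:
        dst.execute("INSERT INTO policies_clean VALUES (?, ?)", (pid, float(premium)))
    except ValueError:
        pass  # an SSIS data flow would redirect this row to an error output

count, total = dst.execute(
    "SELECT COUNT(*), SUM(premium) FROM policies_clean"
).fetchone()
print(count, total)  # 2 2480.5
```

In SSIS the same three steps are configured graphically as a source component, a data-conversion transform, and a destination, with bad rows routed to an error output rather than silently skipped.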

Compensation

The listed annualized base pay range is primarily based on analysis of similar positions in the external market. Actual base pay could vary and may be above or below the listed range based on factors including but not limited to performance, proficiency and demonstration of competencies required for the role. The base pay is just one component of The Hartford’s total compensation package for employees. Other rewards may include short-term or annual bonuses, long-term incentives, and on-the-spot recognition. The annualized base pay range for this role is:

$108,480 – $162,720

Equal Opportunity Employer/Females/Minorities/Veterans/Disability/Sexual Orientation/Gender Identity or Expression/Religion/Age

About Us | Culture & Employee Insights | Diversity, Equity and Inclusion | Benefits

Sr Data Engineer – GE07BE

Apply Here

********

ENGINEER, DATA ANALYTICS-991 at Boardwalk Pipelines

Location: Houston

Boardwalk Pipelines, LP primarily provides transportation and storage of natural gas and liquids for our customers. Additional information about the company can be found online at

POSITION DESCRIPTION: REQ

We are currently seeking an Engineer (level I through Senior) who can sit in our Houston, TX or Owensboro, KY office, preferably Owensboro, KY.

The incumbent will join the Reliability and Data Analytics team in supporting Operations and Gas Control by monitoring our in-house asset condition monitoring program. The monitored asset base consists of 340+ natural gas-fired and electric-driven rotating and reciprocating gas compressors, 140+ electric driven liquid hydrocarbon pumps, and other ancillary equipment.

The mission of this position will be to identify degradation in asset performance and early signs of equipment failure so that the repair can be scheduled with minimal downtime and interruption of pipeline capacity.

DUTIES AND RESPONSIBILITIES (INCLUDING BUT NOT LIMITED TO):

The primary function will be monitoring, and supporting the development and enhancement of, asset health dashboards and trends of live-streaming and historical data from various sources.
Will develop and utilize machine learning models to predict target values for specific performance-related parameters.
When an actionable issue is detected, the incumbent will perform further trend analysis and investigation to identify the potential root cause of the anomaly, then enter Work Orders in Maximo and collaborate closely with Operations to proactively address these issues with minimal customer impact.
Will verify performance has been regained or the issue has been resolved.

REQUIRED EDUCATION:

Bachelor's Degree in Mechanical Engineering or another Engineering discipline

REQUIRED SKILLS, KNOWLEDGE, AND EXPERIENCE :

1-5+ years of experience with Maximo or another CMMS/EAM work order management system
1-5+ years of experience with Microsoft Power BI
Strong commitment to working safely
Exceptional personal/soft skills, data analytics, problem solving and communication skills
Solid understanding of thermodynamics as it relates to compression and combustion processes
Advanced knowledge of Microsoft Excel

PREFERRED SKILLS, KNOWLEDGE, AND EXPERIENCE :

Experience with Maximo or other CMMS/EAM work order management system
Experience with Microsoft Power BI
Python programming as it relates to machine learning
Performing advanced diagnostics
Knowledge of time-series data historians

ADDITIONAL INFORMATION :

Boardwalk Pipelines, LP maintains a drug-free workplace and will require pre-employment drug & substance abuse testing before hire.

Boardwalk Pipelines, LP is an equal opportunity / affirmative action employer. All applicants will be considered for employment regardless of race, color, religion, age, sex, gender identity, national origin, veteran, or disability status.

Job Type: Full-time

Pay: From $80,000.00 per year

Benefits:
401(k)
Dental insurance
Flexible schedule
Health insurance
Paid time off
Vision insurance
Schedule:
8 hour shift

Ability to commute/relocate:
Houston, TX 77046: Reliably commute or planning to relocate before starting work (Required)

Application Question(s):
Do you have a Bachelor's Degree in Mechanical Engineering or another Engineering discipline?
Do you have 1-5+ years of experience with Maximo or another CMMS/EAM work order management system?
This position requires a solid understanding of thermodynamics as it relates to compression and combustion processes. Have you worked in this area, or do you have knowledge of it?

Willingness to travel:
25% (Preferred)

Work Location: One location
Apply Here

********

Data Engineer at KForce

Location: Houston

Description: Kforce’s client is searching for a Data Engineer in Houston, TX.
Duties:
• Put your passion for CI/CD to work and enjoy the impact it has on software quality and customers
• Live and love Docker, EKS, GitLab, and Terraform
• Build Terraform scripts and other deployment and configuration automation
• Live, laugh, and love some flavor of Agile; With a side of Scrum
• Work closely with other teams and individuals to plan, coordinate, and seek feedback
• Pitch in where needed as a valued team member; There is no “I” in team
Requirements:
• Experience in productionizing various big data technologies both open source and cloud native, AWS preferred (Kafka, Airflow, Dremio, etc.)
• Expertise in data model design with sensitivity to usage patterns and goals – schema, scalability, immutability, idempotency, etc.
• Expertise in at least two of the following languages – Python, Go, Scala, Java
• Experience in handling Large Scale Time Series data
• Experience in GraphQL, Apollo and Hasura
• Track record of choosing the right transit, storage, and analytical technology to simplify and optimize user experience
• Real-world experience developing highly scalable solutions using micro-service architecture designed to democratize data to everyone in the organization
• Docker, K8s, Cloud, microservices, containerization, web services, DB/SQL, etc. (You get it)
• Strong analytical, problem-solving, and troubleshooting skills; Let’s face it, you are one of the smartest people you know
• Experienced with modern coding, testing, debugging and automation techniques
• Rave about the benefits of CI/CD, unless manual deployments really are your thing
• Have a high bar for user experience and quality
• You are data driven and customer obsessed
• Good communication skills
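Two of the requirements above, idempotency in data model design and handling time-series data, come together in a keyed upsert: replaying the same event batch must leave the store unchanged. A minimal sketch, with an invented event shape:

```python
# Tiny sketch of idempotent time-series ingestion: events are keyed by
# (series, timestamp), so replaying a batch is a no-op. Shape is invented.

def apply_events(store, events):
    """Upsert events keyed by (series, timestamp); safe to replay."""
    for e in events:
        store[(e["series"], e["ts"])] = e["value"]
    return store

batch = [{"series": "temp", "ts": "2022-09-06T00:00", "value": 21.5},
         {"series": "temp", "ts": "2022-09-06T01:00", "value": 21.7}]

store = apply_events({}, batch)
once = dict(store)
twice = apply_events(store, batch)   # delivering the same batch again changes nothing
print(once == twice, len(twice))  # True 2
```

The same property is what makes at-least-once delivery from systems like Kafka safe to consume: duplicates collapse onto the same key instead of double-counting.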
Bonuses to include as part of your application:
• Links to online profiles you use such as GitHub, Twitter, etc.
• A description of your work history
Kforce is an Equal Opportunity/Affirmative Action Employer.

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.
Apply Here

********

Big Data Engineer – Paid Relocation to Greeley CO at Pilgrim’s

Location: Houston

We are looking for a Big Data Solutions Architect for Pilgrim’s, based out of the Corporate Office in Greeley, Colorado.

The mission for this role is to engineer, build, deploy, and maintain robust big data solutions and platforms for business needs.

Responsibilities:
• Responsible for gathering, maintaining, planning, developing and implementing data solutions and tools to support business units across organization.
• Develops and maintains scalable data pipelines and builds out new API integrations to support continuing increases in data volume and complexity.
• Collaborates with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
• Implements processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
• Documents work and processes and applies best-in-class data governance policies.
• Performs data analysis required to troubleshoot data related issues and assist in the resolution of data issues.
• Works closely with a team of frontend developers, product and business managers, analysts, and PMO.
• Designs data integrations and data quality framework.
• Designs and evaluates open source and vendor tools for data lineage.
• Works closely with all business units to develop strategy for long term data platform architecture.
• Manages from start to finish a portfolio of data integration and BI projects, working as the liaison between the business, IT, Digital Transformation teams, and third-party providers.
• Designs rich data visualizations to communicate complex ideas to leaders.
• Leads integration of external vendor data, evaluates new solutions, and sets process standards.
• Provides key support for vendor management.
• Leads the planning and execution of assigned projects.
• Other duties as assigned
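The first two responsibilities above, API integrations feeding scalable pipelines, reduce to a repeatable pattern: parse a vendor payload, apply a quality gate, normalize units, and emit rows for the BI model. A hedged sketch with an entirely invented payload shape:

```python
import json

# Hypothetical API integration step: the payload shape, field names, and
# the quality rule are invented for illustration.
payload = json.loads("""
{"plant": "Greeley", "readings": [
  {"line": 1, "birds_per_min": 140},
  {"line": 2, "birds_per_min": 0},
  {"line": 3, "birds_per_min": 155}
]}
""")

rows = [
    {"plant": payload["plant"], "line": r["line"], "per_hour": r["birds_per_min"] * 60}
    for r in payload["readings"]
    if r["birds_per_min"] > 0          # quality gate: drop idle lines
]
print(len(rows), rows[0]["per_hour"])  # 2 8400
```

Keeping the gate and the unit conversion in the ingestion layer is what ensures "production data is always accurate" for the dashboards downstream, rather than patching it in each report.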

Qualifications:
• BS or MS degree in Computer Science or a related technical field
• Project management and leadership skills are essential
• 2+ years of Python development experience
• 2+ years of SQL experience
• 2+ years of experience with schema design and dimensional data modeling
• 2+ years of experience with machine learning
• Ability to manage and communicate data warehouse plans to internal clients
• Experience designing, building, and maintaining data processing systems
• Strong analytical and logical skills, curiosity, organization, attention to details, discipline are crucial for this position.
• Ability to perform technical deep-dives into code, networking, systems and storage configuration
• Hands-on experience developing cloud data infrastructure
• Proficient in Python, SQL, and popular cloud platforms (Microsoft Azure, AWS, or Google Cloud Platform).
• Technical hands-on skills in machine learning (regression, classification, clustering, dimensionality reduction), deep learning (CNN, RNN/LSTM), time series data, optimization, anomaly detection, statistical algorithms, and data engineering.
• Excellent understanding of machine learning techniques and algorithms, such as k-NN, Naive Bayes, SVM, Decision Forests, etc.
• Good applied statistics skills, such as distributions, statistical testing, regression, etc.
• Experience with big data technologies such as Hadoop, R, and Java / MapReduce a plus
• Experience with NoSQL databases, such as MongoDB, Cassandra, HBase, DynamoDB a plus
• SAP BI HANA knowledge and QlikSense reporting is a plus
• Working knowledge of relational databases
• Familiar with microservices architecture
• Good knowledge of Restful API development, know how to design and develop Restful API
• Git code management practices (GitHub)
• Experience in writing unit tests to ensure code quality
• Current understanding of best practices regarding system security measures
• Experience in data engineering and design architecture
• Demonstrated experience with agile development methods. Agile at scale is a plus.
• A drive to learn and master new technologies and techniques.
• Experience in agricultural/meat/animal industries is a plus.
• As a salaried position with the company, you may be required to travel at some point to other facilities, to attend Company events, or as a representative of the Company in other situations. Unless otherwise specified in this posting, the amount of travel may vary and the most qualified candidate must be willing and able to travel as business needs dictate.

The applicant who fills this position will be eligible for the following compensation and benefits:
• Benefits: Vision, Medical, and Dental coverage begin after 60 days of employment;
• Paid Time Off: sick leave, vacation, and 6 company observed holidays;
• 401(k): eligible after 90 days of employment including company match which begins after the first year of service and follows the company vesting schedule
• Base salary range of $115,000-$146,000; and
• Incentive Pay: This position is eligible to participate in the Company’s annual bonus plan, the amount of bonus varies and is subject to the standard terms and conditions of the incentive program;

For individuals assigned and/or hired to work in Colorado, JBS and Pilgrim’s is required by law to include a reasonable estimate of the compensation for this role. This compensation range is specific to the State of Colorado and takes into account various factors that are considered in making compensation decisions, including but not limited to a candidate’s relevant experience, qualifications, skills, competencies, and proficiencies for the role.

The Company is dedicated to ensuring a safe and secure environment for our team members and visitors. To assist in achieving that goal, we conduct drug, alcohol, and background checks for all new team members post-offer and prior to the start of employment. It is a job expectation that all new employees are fully vaccinated against COVID-19. Those who have, at minimum, the first of a two-dose vaccine in advance of their first date of employment will be required to receive the second dose within the manufacturer's recommended timeframe and submit proof of their final vaccination dose once obtained. If you need assistance in obtaining a vaccine, the Company can help schedule you with one of its healthcare partners following a conditional job offer, if one is made. The Immigration Reform and Control Act requires that verification of employment eligibility be documented for all new employees by the end of the third day of work.

About Us:
Pilgrim’s is the second largest chicken producer in the world, with operations in the U.S., Puerto Rico, Mexico and the U.K. Pilgrim’s processes, prepares, packages and delivers fresh, further-processed and value-added poultry products for sale to customers in more than 100 countries, employs more than 50,000 people and contracts with more than 5,200 family farmers. Pilgrim’s is headquartered in beautiful Greeley, Colorado, at the JBS USA corporate office where our 1,200 employees enjoy more than 300 days of sunshine a year.

Our mission: To be the best in all that we do, completely focused on our business, ensuring the best products and services to our customers, a relationship of trust with our suppliers, profitability for our shareholders and the opportunity of a better future for all of our team members.

Our core values are: Availability, Determination, Discipline, Humility, Ownership, Simplicity, Sincerity

EOE/VETS/DISABILITY
Apply Here

********

Data Integration Engineer – Remote at Mindex

Location: Houston

Founded in 1994, Mindex specializes in software development for large enterprise clients in the Rochester, NY area and for New York State K-12 schools. We are a rapidly growing organization expanding heavily into cloud application development and modernizing our home-grown K-12 Student Management System, SchoolTool.

The Data Integration Engineer is responsible for developing data integration processes to support data initiatives aimed at assuring accuracy and consistency of critical business data across systems. This position requires depth and breadth of experience in data and database technologies including proficient SQL and software development. Provides technical support during critical and non-critical business hours.

Duties and Responsibilities
• Develops an understanding of the assigned data environment through data profiling and analysis, using enterprise data tools to assess the quality and cleanliness of data. Identifies opportunities to improve data quality.
• Develops batch and streaming integration jobs using the enterprise data platform and tools.
• Builds solutions to extract, cleanse, transform, enrich, load, and validate all required data for the successful implementation of data migration.
• Documents data integration and data quality results and requirements, proposed solutions, and code to provide traceability from requirements through code implementation. Develops data monitoring solutions based on defined data quality business rules.
• Consults with infrastructure and application architects to integrate solutions into the environment, ensuring consistency with architecture, standards, and integration best practices.
• Coordinates the management of data tools and platforms so that software patches and software/hardware upgrades are planned and executed appropriately.
• Works with the Data Management Organization and data owners to establish and progress toward data management maturity and a data quality index.
• Works with capacity planning and performance testing groups to ensure that solutions perform to Service Level Agreements.
Requirements
• Bachelor’s Degree in Computer Science or equivalent work experience preferred
• Data profiling and data monitoring – Able to work on Data Quality development that includes building data profile and data monitoring jobs. (DataFlux Data Management Studio preferred)
• Data Migration: Able to build data integration and migration jobs. (DataFlux Data Management Studio preferred)
• Data Extraction and Orchestration: Able to build complex data extraction queries using SQL and automate extraction process.
• Data Cleansing and enrichment: Able to build data jobs to standardize, parse and enrich data. Able to perform data transformations. (DataFlux preferred)
• Data Modeling: Able to understand logical and physical data models.
• Continuous Integration and Deployment (CI/CD): Able to understand and build CI/CD pipelines for Data Quality deliverables
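To make the data-profiling skill above concrete: the posting prefers DataFlux Data Management Studio, which is proprietary, so the following is only a minimal sketch of the underlying idea in plain Python against an in-memory SQLite table. The table and column names are hypothetical, not from the posting.

```python
import sqlite3

def profile_column(conn, table, column):
    """Minimal data-profiling job: row count, null count, distinct count,
    min/max. A toy stand-in for the kind of column profile a dedicated
    data-quality tool would produce."""
    cur = conn.execute(
        f"SELECT COUNT(*), COUNT({column}), COUNT(DISTINCT {column}), "
        f"MIN({column}), MAX({column}) FROM {table}"
    )
    total, non_null, distinct, lo, hi = cur.fetchone()
    return {
        "rows": total,
        "nulls": total - non_null,  # COUNT(col) skips NULLs
        "distinct": distinct,
        "min": lo,
        "max": hi,
    }

# Example against a hypothetical table with one missing and one duplicate value.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (email TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?)",
    [("a@x.com",), ("b@x.com",), (None,), ("a@x.com",)],
)
print(profile_column(conn, "customers", "email"))
# {'rows': 4, 'nulls': 1, 'distinct': 2, 'min': 'a@x.com', 'max': 'b@x.com'}
```

A monitoring job in the sense the posting describes would run a profile like this on a schedule and alert when a metric (e.g. the null count) crosses a defined data-quality business rule.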
Benefits
• Health insurance and telemedicine
• Paid holidays
• Paid time off
• 401k retirement savings plan and company match
• Dental insurance
• Vision insurance
• Disability insurance
• Life insurance and AD&D insurance
• Paid family leave
• Employee assistance program
• Pre-tax flexible spending accounts
• Health reimbursement account
Mindex Perks
• Tickets to sporting events
• Teambuilding events
• Holiday and celebration parties
• License to Udemy online training courses
Mindex Incentives
• Leadership training
• Professional development
• Growth opportunities
• Annual review
• Bonuses
Apply Here
For Remote Data Integration Engineer – Remote roles, visit Remote Data Integration Engineer – Remote Roles

********

Practice Manager – Data Science & Engineering (Remote role in USA) at Rackspace Technology

Location: Houston

As a full-spectrum AWS integrator, we help hundreds of companies realize the value, efficiency, and productivity of the cloud. We take customers on their journey to enable, operate, and innovate using cloud technologies – from migration strategy to operational excellence and immersive transformation.

If you like a challenge, you’ll love it here, because we’re solving complex business problems every day, building and promoting great technology solutions that impact our customers’ success. The best part is, we’re committed to you and your growth, both professionally and personally.

Overview

As a Big Data Architect, you are passionate about data and technology solutions, are driven to learn about them and keep up with market evolution. You will play an active role in delivering modern data solutions for clients including data ingestion/data pipeline design and implementation, data warehouse & data lake architectures, cognitive computing and cloud services. You are enthusiastic about all things data, have strong problem-solving and analytical skills, are tech savvy and have a solid understanding of software development.

If you get a thrill working with cutting-edge technology and love to help solve customers’ problems, we’d love to hear from you. It’s time to rethink the possible. Are you ready?

About The Role
• Data scientist or engineer with leadership experience and familiarity with machine learning, data analytics, big data processing, data lakes & warehouses, and related technologies and architectures.
• Expected to lead and manage teams of data scientists and data engineers on various data and machine learning projects.
• Required experience developing solutions on a major cloud platform, as well as demonstrated people management experience.
• Expected skills to include cloud architecture design and implementation, as well as experience with machine learning tools and libraries. Expected to have demonstrated technical leadership in the development of multiple successful production solutions.
• Expected to have demonstrated people management skills with direct reports.
• Experience with neural networks, computer vision, language processing, big data processing, or other machine learning sub-domains is desired.
• Familiarity with data engineering languages and platforms, such as SQL, Spark or Hadoop, is desired. Familiarity with managed machine learning services on a major cloud platform is also desired.
• Candidate will be expected to lead projects to solve a variety of problems by developing cloud-based solutions for customers.
• Candidate should be able to work independently, demonstrate technical leadership and people management experience, and have excellent communication skills.
About Rackspace Technology

We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies – across applications, data and security – to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology

Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Apply Here
For Remote Practice Manager – Data Science & Engineering (Remote role in USA) roles, visit Remote Practice Manager – Data Science & Engineering (Remote role in USA) Roles

********

Data Engineer at Robert Half Technology

Location: Houston

Description: As a Data Engineer, you will develop and maintain scalable data pipelines while collaborating with analytical and business teams to improve data models.

Responsibilities:

• Implements processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
• Writes unit/integration tests, contributes to the engineering wiki, and documents work.
• Performs data analysis required to troubleshoot data-related issues and assists in their resolution.
• Works closely with a team of frontend and backend engineers, product managers, and analysts.
• Defines company data assets (data models) and other jobs to populate data models.
• Designs data integrations and a data quality framework.
• Designs and evaluates open source and vendor tools for data lineage.
• Works closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.

Minimum Qualifications:

• Degree in an analytical field (e.g. Computer Science, Mathematics, Statistics, Engineering, Operations Research, Management Science) preferred, with 4+ years of professional experience
• At least 4 years of data analytics experience in a distributed computing environment
• Database maintenance
• Building and analyzing dashboards and reports
• Evaluating and defining metrics and performing exploratory analysis
• Monitoring key product metrics and understanding root causes of changes in metrics
• Empowering and assisting operations and product teams through building key data sets and data-based recommendations
• Automating analyses and authoring pipelines via a SQL/Python-based ETL framework
• Superb SQL programming skill
• Understanding of ETL tools and database architecture
• Advanced knowledge of data warehousing
• Strong knowledge of code and programming concepts
• Experience with Python
• Experience with Kubernetes deployments and a DevOps approach
• Highly motivated self-starter who is flexible and goal-oriented
• Strong Python knowledge: data models, object-oriented programming, testing (unit/regression)
• Database experience: window functions, partitioning/indexes, relational and non-relational
• Big data: experience with Hadoop, Spark, and/or the DataFrame API
• Performance benchmarking: cluster configuration/optimization or Spark optimization
• Version control and CI/CD: Git or Jenkins, Drone
• Some cloud experience: AWS (primary), Azure, Google Cloud
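The qualifications above name SQL window functions with partitioning among the expected database skills. As a minimal illustration only (hypothetical table and data, using Python's built-in sqlite3 and assuming SQLite ≥ 3.25, which added window-function support), this sketch ranks rows within each customer partition:

```python
import sqlite3

# Hypothetical orders table to demonstrate RANK() OVER (PARTITION BY ...).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 30.0), ("alice", 10.0), ("bob", 20.0)],
)

# Rank each customer's orders by amount, largest first,
# restarting the rank for every customer partition.
rows = conn.execute(
    """
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY customer, rnk
    """
).fetchall()

for customer, amount, rnk in rows:
    print(customer, amount, rnk)
# alice 30.0 1
# alice 10.0 2
# bob 20.0 1
```

The same pattern is available in Spark's DataFrame API via `Window.partitionBy(...).orderBy(...)`, which is presumably the context the posting has in mind.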

Requirements: PySpark, Python, Apache Kafka

Technology Doesn’t Change the World, People Do.® Robert Half is the world’s first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies.

We offer contract, temporary and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.

Robert Half puts you in the best position to succeed by advocating on your behalf and promoting you to employers. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity – even on the go. Download the Robert Half app and get 1-tap apply, instant notifications for AI-matched jobs, and more.

Robert Half will consider qualified applicants with criminal histories in a manner consistent with the requirements of the San Francisco Fair Chance Ordinance. All applicants applying for U.S. job openings must be authorized to work in the United States. Benefits are available to temporary professionals.
Apply Here
For Remote Data Engineer roles, visit Remote Data Engineer Roles

********

Senior Data Engineer – Telecommute at UnitedHealth Group

Location: Houston

Combine two of the fastest-growing fields on the planet with a culture of performance, collaboration and opportunity and this is what you get. Leading edge technology in an industry that’s improving the lives of millions. Here, innovation isn’t about another gadget, it’s about making health care data available wherever and whenever people need it, safely and reliably. There’s no room for error. Join us and start doing your life’s best work.(sm)

We are looking for a Senior Data Engineer to assist in the continued development of our award-winning health analytics platform, which combines cutting-edge technology, data informatics services and cloud-based analytics tools. In this exciting role, you will collaborate with some of the smartest developers in the industry, designing and implementing solutions to some of the toughest information challenges facing health care today. We are looking for talented, innovative, effective engineers who take pride in and ownership of their responsibilities, and who can confidently communicate and champion their ideas. You’ll enjoy the flexibility to telecommute from anywhere within the U.S. as you take on some tough challenges.

Primary Responsibilities:
• Meet the Data Engineering needs of the Optum Analytics Life Sciences business as part of the Life Science Data Factory team
• Perform all phases of data engineering including requirements analysis, design, development and testing
• Design and implement features in collaboration with business and IT stakeholders
• Design reusable workflows, components, frameworks and libraries
• Design and develop innovative solutions to meet the needs of the business
• Review code and provide feedback to peers
• Research and champion new technologies

You’ll be rewarded and recognized for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role as well as provide development for other roles you may be interested in.
Required Qualifications:
• 3 years of professional experience with SQL
• 3 years of professional experience with Python
• 3 years of professional experience with Spark (or PySpark)
• 2 years of professional experience developing solutions on AWS (Lambda, EMR, Step Functions, Glue, Athena, S3, EC2, etc.)

Preferred Qualifications:
• Bachelor’s or Master’s degree
• Experience with Scala
• Background in healthcare data
• AWS Certification

To protect the health and safety of our workforce, patients and communities we serve, UnitedHealth Group and its affiliate companies require all employees to disclose COVID-19 vaccination status prior to beginning employment. In addition, some roles and locations require full COVID-19 vaccination, including boosters, as an essential job function. UnitedHealth Group adheres to all federal, state and local COVID-19 vaccination regulations as well as all client COVID-19 vaccination requirements and will obtain the necessary information from candidates prior to employment to ensure compliance. Candidates must be able to perform all essential job functions with or without reasonable accommodation. Failure to meet the vaccination requirement may result in rescission of an employment offer or termination of employment.

Careers with Optum. Here’s the idea. We built an entire organization around one giant objective: make health care work better for everyone. So, when it comes to how we use the world’s largest accumulation of health-related information, or guide health and lifestyle choices or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve. Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. For you, that means working on high performance teams against sophisticated challenges that matter.
Optum, incredible ideas in one incredible company and a singular opportunity to do your life’s best work.(sm) All telecommuters will be required to adhere to UnitedHealth Group’s Telecommuter Policy.

Colorado, Connecticut or Nevada Residents Only: The salary range for Colorado residents is $82,100 to $146,900. The salary range for Connecticut/Nevada residents is $90,500 to $161,600. Pay is based on several factors including but not limited to education, work experience, certifications, etc. In addition to your salary, UnitedHealth Group offers benefits such as a comprehensive benefits package, incentive and recognition programs, equity stock purchase and 401k contribution (all benefits are subject to eligibility requirements). No matter where or when you begin a career with UnitedHealth Group, you’ll find a far-reaching choice of benefits and incentives.

Diversity creates a healthier atmosphere: UnitedHealth Group is an Equal Employment Opportunity/Affirmative Action employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law. UnitedHealth Group is a drug-free workplace. Candidates are required to pass a drug test before beginning employment.
Apply Here
For Remote Senior Data Engineer – Telecommute roles, visit Remote Senior Data Engineer – Telecommute Roles

********

The Tech Career Guru