Full-time Data Engineer openings in Portland on September 23, 2022

Big Data Engineer – 100% REMOTE at Software Guidance & Assistance, Inc. (SGA, Inc.)

Location: Portland

Job Description

Software Guidance & Assistance, Inc. (SGA) is searching for a Big Data Engineer for a full-time, fully remote assignment with one of our premier Healthcare Services clients.
• Position is based in the EST time zone.
• Client is unable to sponsor now or in the future.
Responsibilities
• Hands-on software development in Python, SQL, Apache Spark.
• Design and build high-scale, near-real-time and batch data processing pipelines (see the sketch after this list).
• Partner with your peers in Product, Design, and Technology within a Scrum framework to develop our next-generation data platform.
• Create and maintain technical documentation.
• Guide and deliver meaningful and thorough code reviews.
• Ensure performance, quality, and accuracy of our data.
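As a loose illustration of the batch side of the pipelines described above, here is a minimal PySpark sketch; the bucket paths and column names (`event_ts`, `event_type`) are hypothetical placeholders, not part of the posting.

```python
# Minimal PySpark batch-aggregation sketch; paths and columns are
# hypothetical placeholders, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch_events_example").getOrCreate()

# Placeholder input: a JSON event feed with event_ts and event_type columns.
events = spark.read.json("s3://example-bucket/events/2022-09-23/")

daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

daily_counts.write.mode("overwrite").parquet("s3://example-bucket/daily_counts/")
```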
Qualifications

Required Skills
• Bachelor’s degree or an equivalent working experience.
• 1+ years of Apache Spark development experience (PySpark, Python, SQL).
• A strong passion for new technologies and driving innovation.
• Deliver high-quality, production-ready code; this is a high priority for you.
• Eager to learn new technologies and expand your skill set.
• A creative and inventive software development expert.

SGA is a Certified Women’s Business Enterprise (WBE) celebrating over thirty years of service to our national client base for both permanent placement and consulting opportunities. For consulting positions, we offer a variety of benefit options including but not limited to health & dental insurance, paid vacation, and timely payment via direct deposit. SGA accepts transfers of H-1B sponsorship for most contracting roles. We are unable to sponsor for Right-to-Hire, Full-time, or Government roles. All parties authorized to work in the US are encouraged to apply for all roles. Only those authorized to work for government entities will be considered for government roles. Please inquire about our referral program if you would like to submit a candidate for any of our open or future job opportunities. SGA is an EEO employer. We encourage Veterans to apply. To view all of our available job postings and/or to learn more about SGA, please visit us online.
Apply Here

********

Sr Staff Data Platform Engineer at Mozilla

Location: Portland

Mozilla’s Data Engineering team is looking for a Senior Staff Data Platform Engineer to help build the present and future of our Data Platform infrastructure. Mozilla’s data platform serves the needs of multiple products, including the Firefox browser, from data collection through to insights, product experiences, and machine learning applications.

Our goal is to build a robust, efficient, and effective platform that makes it increasingly easy to inform Mozilla’s business and product experiences. For more details, please take a look at the overview or the Data@Mozilla blog.

Your Work Will Include
• Designing, building, and improving systems to reliably and efficiently process data at the scale of tens of terabytes per day, and applying those systems to solve meaningful problems for the business.
• Evaluating new technologies.
• Crafting and improving workflows in GCP using standard, modern components such as Dataflow, BigQuery, Pub/Sub, GKE, and Airflow (see the sketch after this list).
• Driving cross-functional projects with stakeholders across Mozilla, including Data Science, Product, Legal, Policy, and Engineering.
• Mentoring teammates, developing and improving engineering best practices across the Data Organization.
• ETL / ELT – Working with Data Science to design and implement useful data abstractions.
• Addressing evolving needs of active Data Products.
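As a loose illustration of the GCP workflows mentioned above, here is a minimal Apache Beam (Dataflow) sketch of a streaming Pub/Sub-to-BigQuery flow; the project, topic, and table names are hypothetical placeholders.

```python
# Minimal Apache Beam streaming sketch; topic, table, and message
# format are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic="projects/example/topics/events")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```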
Your Experience Includes
• Data tooling in GCP, including PubSub, BigQuery, and Dataflow (Apache Beam). Bonus points if you’ve also worked with any of the following: Looker, Fivetran, Kubernetes / GKE, Airflow / Cloud Composer. We want to talk to you if you’ve got experience with similar tools on other platforms as well.
• Advanced coding skill in SQL and Python. Some of our codebase is in Java, so experience or willingness to learn Java is a plus.
• Strong software engineering fundamentals: modularity, abstraction, data structures, and algorithms.
• Tech lead, team lead, or similar leadership role.
About Mozilla

Mozilla exists to ensure that the internet is a public resource accessible to all because we believe that open and free is better than closed and controlled. When you work at Mozilla, you give yourself a chance to make a difference in the lives of web users everywhere. And you give us a chance to make a difference in your life every single day. Join us to work on the web as the platform and help create more opportunity and innovation for everyone online.

Commitment to diversity, equity and inclusion

Mozilla understands that valuing diverse creative practices and forms of knowledge is crucial to, and enriches, the company’s core mission. We encourage applications from everyone, including members of all equity-seeking communities, such as (but not limited to) women, racialized and Indigenous persons, persons with disabilities, and persons of all sexual orientations, gender identities, and expressions.

We will ensure that qualified individuals with disabilities are provided reasonable accommodations to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment, as appropriate. Please contact us at hiringaccommodation@ to request accommodation.

We are an equal opportunity employer. We do not discriminate on the basis of race (including hairstyle and texture), religion (including religious grooming and dress practices), gender, gender identity, gender expression, color, national origin, pregnancy, ancestry, domestic partner status, disability, sexual orientation, age, genetic predisposition, medical condition, marital status, citizenship status, military or veteran status, or any other basis covered by applicable laws. Mozilla will not tolerate discrimination or harassment based on any of these characteristics or any other unlawful behavior, conduct, or purpose.

Group: C

Req ID – R1978
Apply Here

********

Data Engineer – Specialist Senior at Deloitte

Location: Portland

AI & Data Engineering

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment.

The AI & Data Engineering (AI&DE) team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI&DE will work with our clients to:

+ Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms

+ Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions

+ Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements

Qualifications

Required:

+ 3+ years of relevant technology consulting or industry experience, with the ability to contribute to an end-to-end architecture and solution overview using Google Cloud Platform (GCP) tools

+ At least one full life cycle (FLC) implementation using the GCP toolset

+ 1+ years of experience with SQL and Python

+ 1+ years of experience leading workstreams within complex technology engagements with resources in multiple locations

+ Bachelor’s Degree or equivalent professional experience

+ Ability to travel up to 50% on average, based on the work you do and the clients and industries/sectors you serve

+ Limited immigration sponsorship may be available

Preferred:

+ Experience with Snowflake cloud data warehouse

+ Experience with Databricks

+ Experience with GCP Cloud Dataflow highly preferred

+ Experience with Spark and Java highly preferred

+ Experience with GCP, including BigQuery, Cloud Data Fusion, Cloud Pub/Sub, and Kubernetes (see the sketch after this list)

+ Experience with Apache Kafka and Striim

+ GCP certifications

+ Experience with other leading commercial Cloud platforms, including Azure and AWS

+ Strong oral and written communication skills, including presentation skills (e.g., MS PowerPoint)

+ Ability to create critical collaterals for client workshops and customer interactive sessions

+ Strong problem solving and troubleshooting skills with the ability to exercise mature judgment

+ An advanced degree in the area of specialization is preferred
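As a loose illustration of the BigQuery experience listed above, here is a minimal sketch using the google-cloud-bigquery Python client against a public sample dataset; the project name is a hypothetical placeholder, and default application credentials are assumed.

```python
# Minimal BigQuery client sketch; the project id is a hypothetical
# placeholder, and the table is a public sample dataset.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT word, SUM(word_count) AS total
    FROM `bigquery-public-data.samples.shakespeare`
    GROUP BY word
    ORDER BY total DESC
    LIMIT 5
"""

for row in client.query(query).result():
    print(row.word, row.total)
```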

AI&DE23

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
Apply Here

********

REMOTE Big Data Software Engineer at Jobot

Location: Portland

Big Data software engineering opportunity with Series B consumer intelligence startup!

This Jobot Job is hosted by Kurt Holzmuller

Are you a fit? Easy Apply now by clicking the “Apply” button and sending us your resume.

Salary $100,000 – $140,000 per year

A Bit About Us

We are a leading global Series B startup in the eCommerce/consumer intelligence space. Our platform leverages robust consumer data sets and AI to give businesses insight into how consumers interact with their products.

As an engineer on our team, you will be responsible for continuing to evolve our platform and develop large-scale systems that process and act on massive amounts of data in real-time.

Why join us?
• Competitive Salary DOE
• Comprehensive Benefits Package
• 401k with match
• Generous PTO
• Bonus
• More!

Job Details

MUST HAVE
• BS in a related field OR equivalent professional experience
• 3+ years of development experience with Python or Java
• AWS
• SQL and NoSQL databases
• Distributed systems
• Spark, Hive, or Hadoop

Interested in hearing more? Easy Apply now by clicking the “Apply” button.
Apply Here

********

Lead Data Engineer at Amwell

Location: Portland

Brief Overview:

The Lead Data Engineer should have experience in end-to-end implementation of data-warehousing projects. This individual will manage, utilize, move, and transform data from our source systems and applications to the cloud to create reports for senior management and internal users. This individual will work both independently on assigned projects and collaboratively with other team members. The Lead Data Engineer will collaborate with architects, business users, and source (data) team members to discover data sources and determine the technical feasibility of fetching them into the consolidated data environment/platform. This individual will iteratively design and build core components for our data platform, build various ETL pipelines among the various tools in play to surface data for consumption by our reporting tools, and prioritize competing requests from internal and external stakeholders, in addition to keeping the reporting infrastructure on par with new product functionality and release cycles. The Lead Data Engineer will become a subject matter expert in data classification within the platform and utilize that expertise to identify the most efficient path to deliver data from source to target, as needed.

Core Responsibilities:
• Design and write excellent, fully tested code for ETL/ELT and streaming data pipelines on a cloud platform.
• Have good communication skills, as well as the ability to work effectively across internal and external organizations and virtual teams.
• Take ownership of design & development processes, ensuring incorporation of best practices, the sanity of code and versioning into different environments through tools like Git, etc.
• Implement product features and refine specifications with our product manager and product owners.
• Continuously improve team processes to ensure information is of the highest quality, contributing to the overall effectiveness of the team.
• Stay familiar with industry changes, especially in the areas of cloud data and analytics technologies.
• Work across multiple areas, such as ETL data pipelines, data model design, and complex SQL queries, with a good understanding of BI/DWH principles.
• Plan and execute both short-term and long-term goals, individually and while leading the team.
• Provide best practices and direction for data engineering and design across multiple projects and functional areas.
• Understand the SDLC (software development life cycle) and have knowledge of Scrum and Agile.

Qualifications:
• 13+ years of development experience building data pipelines.
• Bachelor’s degree or equivalent experience is required, preferably in Computer Science or a related field.
• Minimum of 5 years of experience in the architecture of modern data warehousing platforms using technologies such as big data, cloud, and Kafka.
• Cloud experience on any platform, preferably with BigQuery, Dataflow, Pub/Sub, and Data Fusion.
• Migration experience utilizing GCP to move data from on-prem servers to the cloud.
• Strong Python development for data transfers and extractions (ELT or ETL).
• Experience developing and deploying ETL solutions with Informatica or similar tools.
• Experience working within an agile development process (Scrum, Kanban, etc).
• Familiarity with CI/CD concepts.
• Demonstrated proficiency in creating technical documentation.
• Understand modern database concepts (e.g., how new-generation warehouses such as BigQuery and Redshift are implemented).
• Experience developing Airflow DAGs (see the sketch after this list).
• Previous experience with Informatica or any ETL tool.
• Ability and experience in BI and Data Analysis, end-to-end development in data platform environments.
• Write excellent, fully tested code to build ETL/ELT data pipelines in the cloud.
• Provide in-depth and always-improving code reviews to your teammates.
• Build cloud data solutions and provide domain perspective on storage, big data platform services, serverless architectures, RDBMS, and DW/DM.
• Participate in deep architectural discussions to build confidence and ensure customer success when building new solutions on the GCP platform.
• Fix things before they break.
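As a loose illustration of the Airflow DAG development listed above, here is a minimal sketch of a daily extract-and-load DAG; the DAG id, task names, and task logic are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch; the task logic is a hypothetical placeholder.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull rows from a source system.
    print("extracting...")


def load():
    # Placeholder: write transformed rows to the warehouse.
    print("loading...")


with DAG(
    dag_id="daily_elt_example",
    start_date=datetime(2022, 9, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```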

Additional information

Should you join Amwell and the Engineering team, you can expect:

The development organization is a multi-disciplinary team of engineers dedicated to creating a state-of-the-art TeleHealth experience on every platform we can get our hands on. Our cross-functional teams follow a pragmatic Agile methodology as we balance feature requests, strategic initiatives, tech debt, and exciting partnerships on the path to delivering a market leading product to a quickly growing customer base. We work hand in hand with the whole Amwell organization to ensure that our product meets the needs of all of our users.

Working at Amwell:

Amwell is changing how care is delivered through online and mobile technology. We strive to make the hard work of healthcare look easy. In order to make this a reality, we look for people with a fast-paced, mission-driven mentality. We’re a culture that prides itself on quality, efficiency, smarts, initiative, creative thinking, and a strong work ethic.

Our Core Values include One Team, Customer First, and Deliver Awesome. Customer First and Deliver Awesome are all about our product and services and how we strive to serve. As part of One Team, we operate the Amwell Cares program, which brings needed assistance to our communities, whether that be free healthcare for the underserved or for people affected by natural disasters, support for equality, honoring doctors and nurses, or annual Amwell-matched donations to food banks. Amwell aims to be a force for good for our employees, our clients, and our communities.

Amwell cares deeply about and supports Diversity, Equity and Inclusion. These initiatives are highlighted and reflected within our Three DE&I Pillars – our Workplace, our Workforce and our Community.

Amwell is a “virtual first” workplace, which means you can work from anywhere, coming together physically for ideation, collaboration, and client meetings. We enable our employees with the tools, resources, and opportunities to do their jobs effectively wherever they are. Amwell has collaboration spaces in Boston, Tysons Corner, Portland, Woodland Hills, and Seattle.
• Unlimited Personal Time Off (Vacation time)
• 401K match
• Competitive healthcare, dental and vision insurance plans
• Paid Parental Leave (Maternity and Paternity leave)
• Employee Stock Purchase Program
• Free access to Amwell’s Telehealth Services, SilverCloud and The Clinic by Cleveland Clinic’s second opinion program
• Free Subscription to the Calm App
• Tuition Assistance Program
• Pet Insurance
Apply Here

********

Sr. Software Developer- Data Engineer at BECU

Location: Portland

**SUMMARY**

The Sr. Software Engineer / Data Engineer will lead teams writing software according to design specifications and be responsible for the bulk of the more complex development work. You will be responsible for designing and coding features, automated tests, and scripts, and will also design data models and use data management and data governance tools. You will recommend and implement technical data management solutions for business problems, contributing to system and service design and architecture. This position will present and contribute solutions to technical leadership. The Sr. Software Engineer / Data Engineer works closely with, and mentors, other data engineers and business and systems analysts to build enterprise-class data pipelines.

**RESPONSIBILITIES**

+ Perform all responsibilities in accordance with BECU Competencies, compliance, regulatory and Information Protection requirements.

+ Responsible for development and maintenance of data pipelines on multiple platforms (cloud and on-prem) using best practices that enable data curation, metadata management, master data management, and implementation of data governance policies.

+ Develop prototypes, proofs of concept, and solutions by combining technical expertise with a deep understanding of software design. Deliver and design highly available and scalable services in a production environment.

+ Lead small to medium sized teams in the development and testing of system components/services, code, and design reviews.

+ Present and communicate technical topics to the larger engineering community.

+ Responsible for complex and multi-tier system analysis, design, coding, testing, debugging, and documentation.

+ Responsible for identifying code / design / structural improvement across BECU technical systems, including opportunities for greenfield development, and implementing those improvements.

+ Contribute to BECU code quality and extensibility by exemplifying and enforcing existing coding standards within delivery teams. Assist senior staff in creating and defining those standards.

+ Lead teams in automating and improving business processes and interactions with limited business guidance.

+ Work with Architects and product owners to design and document the team’s technology roadmap and vision.

+ Mentor junior developers / SDETs on the team via individual consulting and code reviews.

+ Perform additional duties as assigned.
**QUALIFICATIONS**

+ Bachelor’s degree in Computer Science or related discipline, or equivalent work experience required.

+ Minimum five years of experience designing software and writing production code in a team environment required.

+ Demonstrated expertise in master data management (MDM), metadata, and data governance tools (Alation Data Catalog, erwin, Collibra, Azure Purview, InfoSphere, Informatica Multidomain MDM, Profisee, SAP Master Data Governance, or Ataccama ONE).

+ Demonstrated expertise in cloud-based and on-prem data management tools (Informatica, Data Factory, Databricks, Synapse, data storage components).

+ Deep understanding of Secure Development best practices, demonstrated by regular use of static code analysis tools to explain and correct secure coding flaws required.

+ Proven ability to deliver highly scalable solutions in multiple programming languages and technical environments over the entire product lifecycle (from ideation to retirement) required.

+ Demonstrated expertise with one of the following required: .NET, C#, Node.js and modern JavaScript frameworks in TypeScript or JavaScript (React, Angular, etc.), or Python. System administration and automation with PowerShell or bash required. Knowledge of Open Source (OSS) technologies and libraries required. Experience with public cloud (Azure/AWS/Google Cloud) technologies required. Working knowledge of containers and container orchestration in Kubernetes preferred.

+ Deep experience in one of the following domains: server-side web dev, modern client-side web dev, REST/web services, large scale data analytics using Machine Learning frameworks, networking, and service mesh required.

+ Expertise with SOLID design principles, Object-Oriented Programming and Design required. Experience with SOA patterns and distributed systems design required.

+ Expertise with Continuous Integration and Continuous Delivery systems and tools such as Azure DevOps Services, GitHub Actions, Jenkins, or TeamCity. Proficiency in building build/deployment pipelines in YAML required.

+ Deep expertise in Test Driven Development concepts, methods, and tools. Demonstrated experience in unit testing, integration testing or performance/load testing required.

+ Experience using git, including performing code reviews, pull requests, and following branching standards such as Git Flow or Trunk-Based Development required.

+ Experience presenting in front of technically adept audiences required.

+ Experience with SQL, large datasets, data warehousing, sophisticated ETL processes, and analytics engines required. Knowledge of cloud-hosted SQL-based datastores and NoSQL systems preferred.

+ Proven ability to stay current with emerging technologies and new applications of existing technologies, through work or continuing industry or education involvement required.

+ Proven experience leading and collaborating within teams, including business unit teams, to deliver solutions through all aspects of the SDLC required.
**EEO Statement:**

BECU is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, veteran status, disability, sexual orientation, gender identity, or any other protected status.
Apply Here

********

SW Data Engineer at Resume Library

Location: Portland

We are unable to work with 3rd-party or corp-to-corp candidates for this position

SW Data Engineer

Note: This is a remote 12-month temporary assignment with the possibility of extension. Our client will consider candidates from any location in the United States. Open for direct applicants only.

Position Summary

In the Software Data Engineer role, you will work with the Software Engineering team for a product in the district division. You love data consistency, quality, accuracy, and integrity, and are passionate about data infrastructure. If this sounds like the work you enjoy, let us know!

The Software Data Engineer will focus on helping build high quality reliable, accurate, consistent, and architecturally sound systems that are aligned with our data business needs. They may be called upon to fulfill functions related to software development that are not specifically software programming, such as requirements gathering, producing documentation, quality assurance testing, data validation, and leading feature development.

Your Next Challenge…

Participate in the design, prototyping, and implementation of cloud data platforms: data processing, data transformation, orchestration, and related applications

Identify ways to improve data reliability, efficiency, and quality in the pipeline

Support and advise internal teams on how to integrate multiple sources of data into the data platform

Collaborate with quality engineers to resolve software defects, advise on building test automation and capture data quality metrics

Collaborate with project architects and assist team members to prove the validity of new software technologies

Traits for Success…

Demonstrated ability to follow through with all tasks, promises and commitments

Ability to communicate and work effectively within priorities

Ability to advocate ideas and to objectively participate in design critiques

Ability to work under tight timelines in a fast-paced environment

Ability to build strong customer relationships and deliver customer-centric solutions

Ability to solve complex problems

Competencies

Instill trust: gain the confidence and trust of others through honesty, integrity, and authenticity

Communicate effectively: develop and deliver multi-mode communications that convey a clear understanding of the unique needs of different audiences

Customer focus: build strong customer relationships and deliver customer-centric solutions

Drive results: consistently achieve results, even under tough circumstances

Education and Experience

Minimum Bachelor’s degree in computer science or a related field; or an equivalent combination of education and experience will be considered in lieu of a degree

Minimum 2-4 years of progressive experience in a software development environment in high growth technology companies

Proven experience with multiple completed projects with significant, clearly attributable individual design and implementation contributions.

Proven experience working across multiple tiers of an application, including a database, network, operating system, and containers.

Experience working in an Agile environment

Fluency in SQL and database technologies

Fluency in one or more programming languages or frameworks, including Angular and Java/Python

Demonstrable knowledge of AWS and data platform experience: S3, Kinesis, DynamoDB, RDS (see the sketch below)

Demonstrable knowledge of data quality test frameworks
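As a loose illustration of the AWS data platform experience listed above, here is a minimal boto3 sketch that writes an object to S3 and reads it back; the bucket and key names are hypothetical placeholders, and credentials are assumed to come from the standard AWS configuration.

```python
# Minimal boto3 sketch; bucket and key are hypothetical placeholders,
# and credentials come from the standard AWS config/environment.
import boto3

s3 = boto3.client("s3")

s3.put_object(
    Bucket="example-data-bucket",
    Key="raw/events/2022-09-23.json",
    Body=b'{"event": "page_view", "count": 1}',
)

obj = s3.get_object(Bucket="example-data-bucket", Key="raw/events/2022-09-23.json")
print(obj["Body"].read().decode("utf-8"))
```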
Apply Here

********

Senior Machine Learning Engineer at Gemini

Location: Portland

Empower the Individual Through Crypto

Gemini is a crypto exchange and custodian that allows customers to buy, sell, store, and earn more than 30 cryptocurrencies like bitcoin, bitcoin cash, ether, litecoin, and Zcash. Gemini is a New York trust company that is subject to the capital reserve requirements, cybersecurity requirements, and banking compliance standards set forth by the New York State Department of Financial Services and the New York Banking Law. Gemini was founded in 2014 by twin brothers Cameron and Tyler Winklevoss to empower the individual through crypto.

Crypto is about giving you greater choice, independence, and opportunity. We are here to help you on your journey. We build crypto products that are simple, elegant, and secure. Whether you are an individual or an institution, we want to help you buy, sell, and store your bitcoin and cryptocurrency. Crypto is not just a technology, it’s a movement.

At Gemini, our mission is to empower the individual and that includes giving our employees flexibility of choice – our Office Optional Policy allows employees to choose to work from one of our physical locations or from home.

Select roles that are location-specific will still be eligible for flexible schedules.

The Department: Data

Data is central to all of our business functions and drives many of our most important decisions at Gemini. As a result, the analysts, scientists, and engineers that make up the Data team are uniquely positioned to advise and influence cross-functional projects. These projects cover a wide range of topics, including product strategy, user acquisition and journey, cryptocurrency performance and project development, and crypto’s position as a technology, industry, and asset class within the context of the global financial markets at large.

The Role: Senior Machine Learning Engineer (Python)

The Data Science team, which owns our AI stack, is looking for senior data scientists and machine learning engineers who are interested in solving diverse and complex business problems. You will leverage your experience and communication skills to work across business teams to build and develop innovative machine learning models and algorithms.

Responsibilities
• Be able to distill complex models and analysis into compelling insights for our stakeholders and executives
• Ensure data quality throughout all stages of acquisition and processing, including such areas as data sourcing/collection, ground truth generation, normalization, transformation, cross-lingual alignment/mapping, etc.
• Develop new tools and solutions to enable stakeholders to consume and understand data more intuitively
• Stay up-to-date with data science tools and methodologies in technology and financial domain
• Knowledge of probability and statistics, including experimental design, predictive modeling, optimization, and causal inference. Experience in design and deployment of real-world, large-scale, user-facing systems
• Manage your own process: identify and execute on high impact projects, triage external requests, and make sure you bring projects to conclusion in time for the results to be useful
Qualifications
• Master’s or PhD (preferred) in Statistics, Applied Math, Computer Science, or related fields
• 4+ years of work experience in the analytics and data science domain, focusing on business problems
• 3+ years of experience deploying statistical and machine learning models in production (see the sketch after this list)
• 2+ years of experience integrating data science models into applications
• Skilled in programming languages like Python, Java/C++/C# and SQL
• Sound knowledge of working with large data sets using analytical approaches and quantitative methods
• Solid understanding of machine learning fundamentals, and familiar with standard algorithms and techniques
• Good understanding of deep learning algorithms and workflows
• Ability to analyze a wide variety of data: structured and unstructured, observational and experimental, to drive system designs and product implementations
• Good understanding of cloud computing and infrastructure concepts
• Experience with one or more big data tools and technologies like Snowflake, Databricks, S3, Hadoop, Spark
• Experience working with NLP applications is a plus
• Experience with financial markets and/or retail finance is a plus, specifically across market making, trading, risk management, order books, portfolio management, credit cards, and fraud
• Strong technical and business communication skills
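As a loose illustration of deploying models to production as mentioned above, here is a minimal scikit-learn sketch that trains a classifier and persists it with joblib so a separate serving process can load it; the dataset and file path are illustrative placeholders only.

```python
# Minimal sketch of training and persisting a model for serving;
# the dataset and file path are illustrative placeholders.
from joblib import dump, load
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

dump(model, "model.joblib")          # persisted artifact for the serving side

served_model = load("model.joblib")  # e.g., loaded inside an API process
print(served_model.predict(X[:3]))
```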
It Pays to Work Here

We take a holistic approach to compensation at Gemini, which includes:
• Competitive Compensation and Profit-Sharing Equity
• Flexible vacation policy
• Retirement Plan Matching
• Generous Parental leave
• Comprehensive health plans
• Training and professional development
At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace and affirmative action employer. If you have a specific need that requires accommodation, please let a member of the People Team know.
Apply Here

********

Senior Software Engineer – Python – REMOTE at Jobot

Location: Portland

Remote, Impactful, Mission-Driven, Equity!

This Jobot Job is hosted by Dylan Currier

Are you a fit? Easy Apply now by clicking the “Apply” button and sending us your resume.

Salary $140,000 – $170,000 per year

A Bit About Us

We are a HealthTech company making sure that our patients have the best care possible while ensuring that costs remain affordable. We understand that many people end up wasting countless dollars on medical exams, prescriptions, and hospital time, so our goal is to prevent this from happening.

We have impressive investors, a diverse culture, and a belief in work-life balance. After all, our goal is to improve the lives of those who need help. We want to make sure you remain healthy, both physically and mentally!
• Candidates must be authorized to work in the United States. Unfortunately, we cannot provide sponsorships or visa transfers.

We are headquartered out of SF and you are welcome to come into the office, but it is not mandatory.

Why join us?

As one of our team members, you will receive the following
• Competitive base salary
• Benefits for you and your family
• Significant Equity
• Generous PTO
• Remote-Flexibility (within the US)

Job Details

As a Senior Software Engineer, you will be responsible for the following
• Develop APIs and web apps in Python
• Build and deploy services to AWS
• Perform QA and Unit Tests for the code that you write
• Work cross-functionally with frontend and data engineers

We are looking for someone with the following skills
• Proficiency in Python
• Prior experience with different AWS services
• Experience working with both relational and non-relational databases
• Strong communication skills

Interested in hearing more? Easy Apply now by clicking the “Apply” button.
Apply Here

********

Software Engineer (software/hardware) REMOTE! at Jobot

Location: Portland

Cloud-based data centralization for complex hardware engineering!

This Jobot Job is hosted by Mallory Calloway

Are you a fit? Easy Apply now by clicking the “Apply” button and sending us your resume.

Salary $100,000 – $150,000 per year

A Bit About Us

Cloud-based data centralization for complex hardware engineering! We are building the revolutionary platform that changes how drones, self-driving cars, and other complex hardware are designed, built, integrated, tested, and deployed.

Why join us?

WFH

In person retreats

Top tier benefits

Job Details
**Must be a US citizen**

Responsibilities…

Build integrations with software tools utilized across the engineering lifecycle, such as requirements management, modeling and simulation, product lifecycle management, project management, operations, and purchasing

Translate product specifications, designs, and wireframes into high-quality code

Consistently seek to enhance efficiency, stability, and scalability of our software

Collaborate with stakeholders across the organization such as product, operations, infrastructure, and security leads

Requirements…

Looking for Frontend, Backend, and Full stack candidates.

3+ years of development experience

Hands-on experience with modern technologies like React, JavaScript/Node.js, Java/Kotlin/Go, Postgres, AWS, and RESTful APIs

Experience with hardware programs involving spacecraft, launch vehicles, aircraft, robotics, or consumer or medical devices

Experience with data modeling/data structures

Interested in hearing more? Easy Apply now by clicking the “Apply” button.
Apply Here

********
