Freelance Data Engineer

Freelance/Contract, United States


Description

The Motley Fool is looking for a highly skilled Freelance Data Engineer to join our team on an independent contract basis, 30-40 hours per week for approximately 6 months. This is a mid- to senior-level position and requires 4-5+ years of relevant experience.

 

This role is 100% remote, but candidates MUST reside in the United States to be considered. 

Who are we?

 

We are The Motley Fool, a purpose-driven financial information and services firm with nearly 30 years of experience focused on making the world smarter, happier, and richer. But what does that even mean?! It means we’re helping Fools (always with a capital “F”) demystify the world of finance, beat the stock market, and achieve personal wealth and happiness through our products and services.

 

The Motley Fool is firmly committed to diversity, inclusion, and equity. We are a motley group of overachievers who have built a culture of trust founded on Foolishness, fun, and a commitment to making the world smarter, happier, and richer. However you identify or whatever winding road has led you to us, please don't hesitate to apply if the description above leaves you thinking, "Hey! I could do that!"

 

What does this team do?

 

The Data Engineering team at The Motley Fool creates data pipelines to wrangle data from around the Fool. We collaborate with everyone, from third-party vendors to internal stakeholders, to build easily consumable data structures for reporting and business insights. Working closely with our business analysts and machine learning specialists, we serve the data needs of all The Motley Fool teams!

 

What would you do in this role?

 

As a Freelance Data Engineer, you will be responsible for expanding and optimizing our data and data pipeline architecture, as well as data flow and collection for cross-functional teams. You are an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. You will help guide and support our software developers, database architects, data analysts, and data scientists on business initiatives while ensuring the data delivery architecture stays consistent and optimal. Whether working on a solo project or with the team, you are self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

 

But what would you actually do in this role?

 

        Leverage data assets to meet mission needs, ensuring consistent data quality and establishing data standards and governance

        Work in an agile, collaborative environment, partnering with client stakeholders to develop and improve mission-based solutions

        Monitor cloud-based systems and components for availability, performance, reliability, security, and efficiency

        Create and configure appropriate cloud resources to meet the needs of end users

        Document topology, processes, and solution architecture as needed

        Assist with the training and enablement of data consumers

        Share your passion for staying on top of tech trends, experimenting with and learning new technologies

 

Required Experience:

 

        5+ years of data modeling in a data warehouse
        3+ years with Python
        2+ years of experience with AWS services
        Experience with cloud data storage and compute components, including Lambda functions, EC2 instances, and containers
        Strong problem-solving skills and a proven ability to apply critical/analytical thinking to deliver sustainable and creative solutions to complex requirements

        Ability to work independently, deliver results, and drive projects with minimal supervision
        Strong ability to communicate blockers and issues to management for escalation and timely resolution
        Strong team player with a desire to learn new skills and broaden experience
        Experience working with complex data sets
        Experience migrating from on-prem to cloud systems

 

Preferred Qualifications:

 

        2+ years with Snowflake

        2+ years with Apache Spark

        1+ years with Apache Airflow

        Experience working with financial data

 

Compensation:

The budget for this role is $90-$115 per hour depending on experience.