Principal Data Engineer at Authentica Solutions #vacancy #remote

Principal Data Engineer

About Authentica Solutions

Company Overview:

Authentica Solutions is a leading EdTech organization that aims to reimagine the education sector by providing innovative solutions that empower educators, students, and institutions. We are dedicated to creating a holistic, data-driven educational ecosystem, and we are seeking a skilled Principal Data Engineer to play a pivotal role in advancing our Education Intelligence Platform. The platform leverages iPaaS (Integration Platform as a Service), complex data transformation, data cleansing, standards alignment, and Data Lake technologies, while meeting intricate privacy requirements, to solve complex problems in the EdTech landscape, enhancing our partnerships and driving data-informed decision-making.

Our Customers:

Architecting a platform that delivers self-service capabilities for needs both known and unknown gives this role an opportunity to reimagine middleware, turning it into "miracle-ware" that serves EdTech Software Providers (ISVs), System Integrators, Resellers, Consulting Services Partners, Strategic Partners, Departments/Ministries of Education, K-12 School Districts, and Higher Education organizations globally. With a priority on third-party purchase and use, our platform must solve the most complex data challenges in education, empowering partners to serve their customers' needs quickly, simply, and consistently.

Responsibilities

As a Principal Data Engineer, you will be a leader in our engineering organization with primary responsibility for our data platform. You will guide the existing engineering team responsible for designing, developing, and maintaining our Education Intelligence Platform. Your expertise will be crucial in ensuring the scalability, reliability, and performance of the platform, enabling us to drive true impact for our partners and stakeholders.

You will play a key role in constructing, managing, and enhancing fault-tolerant data infrastructure while upholding the highest standards of data quality and integrity.

  • Collaborating with the engineering and product teams to execute on product goals.
  • Developing and managing fault-tolerant, scalable data pipelines capable of handling terabytes of data using distributed cloud technologies.
  • Developing data ingestion, processing, and transformation techniques to ensure data integrity and quality.
  • Overseeing the maintenance of our practices and processes for testing and code promotion.
  • Conducting POCs to validate new tools and services that enhance our data engineering solutions and products.
  • Troubleshooting production data quality issues and ensuring data integrity.
  • Staying abreast of industry standards and technological advancements to continually improve our engineering output.

About You (or Here’s What We’re Looking For)

  • Growth mindset, insatiably curious, always learning, and welcoming challenges for the opportunity to grow.
  • You believe that you can only be successful when the whole team is successful, and you put your efforts towards it.
  • A keen interest in the education sector and the impact technology can have on it.
  • Ability to bring innovative technical solutions to real problems.
  • Strong verbal and written communication skills.

Required Skills and Experience

Must Have

  • Minimum of 10 years of hands-on experience with Python and related data libraries (e.g., pandas, DataFrames).
  • Practical expertise in ETL/ELT technologies and methodologies.
  • Proven experience in data wrangling and cleaning across structured, semi-structured, and unstructured data formats.
  • Solid design and development background in modern technologies such as API management, REST API integration, containers, and microservices.
  • Experience in designing or working with data warehouses, including an understanding of associated data flows.
  • Exceptional communication skills, both written and verbal.
  • English fluency is required to communicate effectively with our clients and other key stakeholders, both internal and external.

Nice to Have

  • Familiarity or experience with orchestration tools.
  • Knowledge of or hands-on experience with streaming platforms and real-time data pipeline systems such as Apache Kafka and Azure Event Hubs.
  • Experience working with big data storage technologies such as Azure Data Lake, Hadoop, and BigQuery.
  • Education Sector experience.

Join Authentica Solutions and be part of a dynamic team that’s shaping the future of education through data-driven insights. If you are passionate about solving complex problems, building scalable data platforms, and making a positive impact on education, we encourage you to apply! If you don’t meet all the qualifications, but feel you are a strong candidate overall, please apply and tell us why.

Authentica Solutions is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation, national origin, genetics, disability, age, or veteran status. We provide a workplace free from discrimination and harassment, and where employees are treated with respect and dignity. Our employment decisions are based on business needs, job requirements, and individual qualifications.

We encourage candidates from all backgrounds to apply, as we believe a diverse workforce brings a variety of ideas, perspectives, and experiences that enhance our ability to meet the needs of our customers and drive innovation.

Applicants must be authorized to work for any employer in the United States. We are unable to provide or assume responsibility for employment visa sponsorship.
