Lextegrity is an innovative regtech SaaS startup disrupting a large market by pioneering new ways of delivering compliance technology as a service. Its platform helps multinationals eliminate fraud, bribery, and corruption in their global operations. The company already has a leading product used by large multinationals across industries and has been featured by leading press as well as regulators. We are venture-backed with key industry leadership and subject-matter expertise.
We are looking for a Senior Data Engineer with a strong background in data design and data architecture to join our data engineering team and help define our product’s evolution.
This role is focused on our innovative data monitoring platform, which identifies risk in a company's operations. You will build scalable data processing frameworks for use across clients and implement them in data pipelines that process data from a diverse set of data sources.
Day in and day out, this is a hands-on, Python-focused role that requires significant experience in data management and manipulation. The core technical skills for this role are Python, SQL, Apache Airflow, and Spark (PySpark and SparkSQL). Experience with AWS services (such as EMR), database management in MySQL / Postgres, and data pipeline / system design is helpful.
The ideal candidate is someone who has strong data architecture expertise, diverse database experience, and understands how to efficiently and effectively coordinate data preparation tasks. Past work would ideally include experience with data pipelines, architecting data systems, and identifying / implementing optimal data processing & storage approaches.
What you'll do:
- Build data pipelines, and frameworks for creating them, in Apache Airflow / Spark / Postgres for our core product.
- Ensure integrity and security of data.
- Create internal analytics frameworks, tooling, and approaches for use in understanding our systems & researching client data.
- Research and build efficient and scalable data storage and retrieval systems that enable interactive reporting on high dimensional data.
What we need:
- 4+ years of experience building data pipelines, data analytics tools, and visualizations using Python / SQL / Airflow.
- Strong Python skills. Experience with Python packages such as pandas, NumPy, boto3, and Jinja is a plus.
- Highly versed in SQL across multiple databases (Postgres, T-SQL, SparkSQL, etc.), including familiarity with window / analytic functions, stored procedures, etc.
- Significant experience with Airflow (preferred) or other ETL orchestration tooling (AWS Glue, DBT, etc) to build data pipelines.
- Demonstrated Apache Spark skills (writing Spark applications, SparkSQL, etc.).
- Experience using AWS services (EMR, S3, EC2, RDS, MWAA) to build products.
What we’d really love:
- An entrepreneurial mindset and ability to envision and build the tools needed to move our platform to the next level.
- Familiarity with financial systems integrations (SAP, Concur, Oracle, etc.) a plus.
- Experience working with financial transactions, PII, or in a regulated industry is a plus.
- Work with risk / fraud detection or similarly focused analytics is a plus.
What we offer:
- Mission-driven culture, with passion for our customers and the problem we are solving
- Competitive compensation and ownership in the business
- Strong benefits including health, vision, dental, disability, and life insurance
- Focus on team, with respect for the individual. Flexible work from home options and unlimited PTO
- Regular company and team-building events
- Emphasis on diversity and inclusion