HealthRecon Connect provides technology-enabled Revenue Cycle Management solutions to US healthcare providers. The company leverages over 30 years of deep domain expertise, machine learning, AI, cutting-edge analytics, and automated workflows to help its clients improve cash flow and patient outcomes and enjoy peace of mind. At HealthRecon Connect, day after day, we not only hold ourselves accountable for setting and maintaining high standards, but also strive passionately for the highest achievement and customer delight, and thrive on the challenge of high expectations and the commitment to excel.
HealthRecon has been certified as a Great Workplace by Great Place to Work® Sri Lanka since 2018 and was adjudged one of the 40 Best Workplaces in Sri Lanka by Great Place to Work® Sri Lanka in 2021. We are also a Signatory Participant of the United Nations Global Compact.
HRC Labs was established to lead the technological transformation of HealthRecon Connect (HRC). Propelled by the deep domain expertise and industry-leading service capability of HRC, HRC Labs focuses on enhancing the efficiency of healthcare delivery through intelligent automation solutions for healthcare providers. Our tools sustainably improve clients’ operating margins and cash flows by compressing their working capital cycle and reducing their administrative burden.
We are currently looking for an Associate Data Engineer to join our team focused on Revenue Cycle Management (RCM) technology automations and solutions. HRC offers you an open culture where we place importance on each individual’s interests and learning path. We practice a flat team structure with individual ownership of projects and welcome professionals who are curious and quick to acquire new skills. Due to the large volume of applications we receive, all applications will be reviewed in the order in which they were received, and only candidates short-listed for the first round of interviews will be contacted. Thank you for your understanding.
Associate Data Engineer
Monday to Friday
3.00pm to 12.00am Sri Lankan Standard Time (Straddle Shift)
US calendar applicable
Responsibilities:
- Designing and building scalable data storage solutions: design and build data storage systems that handle large volumes of data and are scalable, reliable, and secure.
- Data ingestion and processing: ingest data from various sources and process it into a usable format for analysis and reporting.
- Data integration: integrate data from various sources into a centralized data storage solution, and ensure that the data is consistent, accurate, and up-to-date.
- Data warehousing: design and implement data warehousing solutions to support business intelligence and analytics initiatives.
- ETL (Extract, Transform, Load) processes: design and implement ETL processes to extract data from various sources, transform it into a usable format, and load it into a centralized data storage solution.
- Data pipeline development: design and implement data pipelines to automate the flow of data from various sources to a centralized data storage solution.
- Data quality and governance: implement data quality and governance processes to ensure that the data is accurate, consistent, and secure.
- Data security and privacy: implement data security and privacy measures to protect sensitive and confidential data.
- Performance optimization: optimize the performance of data storage solutions and data processing workflows.
- Collaboration with data scientists and business stakeholders: work closely with data scientists and business stakeholders to understand their data requirements and ensure that the data storage and processing solutions meet their needs.
Requirements:
- HND or Bachelor’s degree in Computer Science, Software Engineering, or a related discipline.
- 1+ years of related experience in data engineering / software engineering.
- Scripting/programming with Python, Scala, etc.
- Hands-on experience working with PL/SQL.
- Experience with real-time data processing tools such as Apache NiFi, or open-source data integration tools such as Talend.
- Hands-on experience working with large volumes of data, including ETL or ELT processes.
- Excellent oral and written communication skills.