Job Description
Join our Data Science team as an Actuarial Data Scientist and architect end-to-end data applications. The role sits on the Data Science team and collaborates closely with the Actuarial team. You will build data products for actuarial users and embed actuarial modeling into data products for underwriting, claims, and other users. You will also work closely with actuaries, empowering them with tools that streamline and improve their current processes, and building new ones.
This role bridges Python-powered data processing in Databricks with interactive Python-driven Streamlit visualizations, offering a unique opportunity to turn insurance metrics into actionable intelligence.
Minimum Qualifications:
- Demonstrated knowledge of insurance business fundamentals and their implementation in data systems.
- Proficiency in Python programming, with experience applying it to data processing, analysis, and visualization (e.g., Pandas, Plotly).
- Proven ability to maintain high standards of precision in both front-end applications and back-end data pipelines, with excellent troubleshooting and debugging skills.
- Exceptional ability to distill complex technical concepts into clear, action-oriented executive summaries that drive business decisions, with proven experience communicating data insights to both technical and non-technical decision-makers.
- Actuarial exam progress (especially CAS Exam 5) preferred.
- Hands-on ML experience in an insurance context preferred.
- Experience in a fast-paced startup environment preferred.
Key Responsibilities:
- Understand the current-state actuarial processes performed throughout the company today in SQL and Excel. Build data applications that automate those processes, and innovate new and better solutions that unify best practices in actuarial science, modern data science, and technology.
- Validate the actuarial assumptions used across our many analyses, and ensure general actuarial soundness throughout our pipelines and analyses.
- Build and enhance Python data pipelines to clean and validate data, and to transform the data through actuarial modeling. Implement robust data validation protocols to ensure integrity of the data and accuracy of the calculations and models.
- Design and maintain intuitive user-facing dashboards that transform complex data into accessible visualizations, connecting back-end data to interactive front-end reporting tools for stakeholders.
- Drive end-to-end system reliability and last-mile validation through rigorous testing and efficient bug resolution, applying meticulous attention to detail across both Streamlit interfaces and data pipelines. Expect to spend significant time troubleshooting complex issues to keep all analytics tools running smoothly.
- Collaborate directly with stakeholders to understand needs, build requirements, gather feedback iteratively, and present findings.
- Teach actuaries to contribute to the data applications we're building by helping them upskill in programming.
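As an illustrative sketch of the data-validation work described above (the field names, rules, and metric here are hypothetical examples, not an actual schema or pipeline from this role):

```python
# Hypothetical sketch of a validation step an actuarial data pipeline
# might include. Field names and rules are illustrative assumptions.

def validate_policy_record(record: dict) -> list[str]:
    """Return a list of validation errors for one policy record."""
    errors = []
    if record.get("earned_premium", 0) < 0:
        errors.append("earned_premium must be non-negative")
    if record.get("incurred_loss", 0) < 0:
        errors.append("incurred_loss must be non-negative")
    if not record.get("policy_id"):
        errors.append("policy_id is required")
    return errors

def loss_ratio(records: list[dict]) -> float:
    """Aggregate incurred losses over earned premium, using only records
    that pass validation."""
    valid = [r for r in records if not validate_policy_record(r)]
    premium = sum(r["earned_premium"] for r in valid)
    losses = sum(r["incurred_loss"] for r in valid)
    return losses / premium if premium else 0.0
```

In practice, checks like these would run inside the pipeline before any actuarial modeling, so that downstream dashboards and analyses only ever see records that passed validation.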
The duties and responsibilities cited above describe the general nature and level of work performed by people assigned to the job. They are not intended to be an exhaustive list of all the duties and responsibilities that an incumbent may be expected or asked to perform.
Additional Information:
Full benefits package including medical, dental, vision, life, company-paid short- and long-term disability, 401(k), tuition assistance, and more.
#LI-Onsite