Specialist Software Engineering – Actuarial Systems Lead
Responsibilities
Lead analysis and complex architectural design across platforms to deliver end-to-end solutions; translate architectural guidance into detailed technical designs for AWS-based and on-premises solutions.
Design, develop, and deliver solutions using Python, PySpark, on-premises technologies, and AWS cloud services such as EC2, S3, Redshift, Glue, EMR, ECS/Fargate, Lambda, API Gateway, and CloudWatch.
Lead committees and strategic projects to define organization-wide standards, processes, and data/cloud governance for financial data, ensuring security and compliance with IAM, KMS, CyberArk, and related controls.
Collaborate in an agile environment with development staff (including contract and offshore) and actuarial/finance partners to translate models into scalable, testable solutions.
Code, test, implement, and document cloud, data engineering, and AI/GenAI solutions using Python, PySpark, and frameworks such as RAG, LangChain, Copilot Studio, and Q Developer for valuation, pricing, ALM, and risk analytics.
Perform design and analysis for deliverables and lead large-team design sessions, including data lake/warehouse patterns and integration of on-premises systems (SQL Server, SSIS, Oracle) with cloud platforms.
Serve as a subject matter expert across applications, data platforms, and AI/ML engineering, including orchestration with Airflow/Glue and event-driven architectures.
Recommend solutions that balance cost, business needs, reliability, and system impacts; optimize workloads across AWS and hybrid environments.
Leverage third-party frameworks, open-source libraries, and APIs; manage CI/CD pipelines (Jenkins, Git, Docker) and infrastructure-as-code (CloudFormation/Terraform) to accelerate delivery.
Lead and mentor development staff (employees, contract, and offshore); uphold engineering excellence, code quality, and model validation practices.
Continuously learn and adopt advances in cloud, data, and AI technologies to enhance the actuarial platform and delivery effectiveness.
Prepare presentations and lead development meetings; clearly communicate architecture, roadmaps, and outcomes to technical and actuarial/finance stakeholders.
Qualifications
- Bachelor’s degree in computer science, information systems, math, engineering, or other technical field, or equivalent experience
- Six years of Python and/or PySpark experience building data-intensive applications and model computation services
- Expertise in coding platforms/frameworks (e.g., Python, PySpark, SQL, OOD/OOP, functional programming, FastAPI/Flask, service design, core architecture)
- Five years of experience and expertise in database design techniques and philosophies (e.g., RDBMS, star schema/Kimball, dimensional modeling; performance tuning)
- In-depth knowledge of NoSQL database technologies (e.g., Amazon DynamoDB, Glue Catalog, Parquet, partitioning)
- Three years of development experience with AWS cloud services (e.g., EMR, Glue, S3, Redshift, Athena, ECS/Fargate, EC2, RDS, Lambda, SQS, API Gateway, CloudWatch, Secrets Manager, KMS, IAM)
- Expertise in build and deployment tools (Git/Bitbucket, Jenkins, artifact repositories; Python packaging and testing)
- Expertise in developing distributed computing and/or grid workloads (e.g., Spark/EMR, parallelized actuarial computations; Databricks optional)
- Five years of experience with integration and service frameworks (e.g., API Gateways, Kafka/SQS/SNS, Swagger/OpenAPI, microservices)
- Expertise with microservices and REST-based API development (e.g., Spring Boot, Spring MVC, Entity Framework, IIS, Swagger, OData, .NET API 2, .NET API Core, AutoMapper)
- Experience leveraging CI/CD pipelines and containers (e.g., Jenkins, Docker, Kubernetes or ECS/Fargate, and container automation)
- Familiarity with modern frontend development frameworks (e.g., React/Angular, Power BI, Amazon QuickSight)
- Advanced understanding of software development and research tools
- Attention to detail and results oriented, with a strong customer focus
- Ability to work as part of a team and independently
- Analytical and problem-solving skills
- Technical communication skills and the ability to present information to all levels of the organization
- Ability to prioritize workload to meet tight deadlines
Preferred Qualifications
- Master’s degree
- Understanding of advanced analytics and machine learning concepts and technology implementations
- Experience with big data and real-time streaming analytics processing architectures
- Experience with data warehousing architecture and implementation, including source-to-target mappings and ETL
- Technology or platform certifications (e.g., AWS, Microsoft)
- Knowledge of the financial services industry
Working Conditions
- Hybrid work environment: in office three days a week (Tuesday, Wednesday, Thursday)
- Work outside of normal business hours may be required
- Moderate travel
Compensation
The salary for this position generally ranges from $125,000 to $160,000 annually. Please note that the salary range is a good faith estimate for this position and actual starting pay is determined by several factors including qualifications, experience, geography, work location designation (in-office, hybrid, remote), and operational needs. Salary may vary above and below the stated amounts, as permitted by applicable law.
Additionally, this position is typically eligible for an Annual Bonus based on the Company Bonus Plan and individual performance, at the Company's discretion.
Applicants must be authorized to work for any employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.
This job description is not a contract of employment nor for any specific job responsibilities. The Company may change, add to, remove, or revoke the terms of this job description at its discretion. Managers may assign other duties and responsibilities as needed. In the event an employee or applicant requests or requires an accommodation in order to perform job functions, the applicable HR Business Partner should be contacted to evaluate the accommodation request.