You will join a newly forming team charged with building our new product in the Provider Data Management space. This is an incredible opportunity to help solve a significant problem in the healthcare industry. The person in this role will have significant experience designing and implementing large-scale, distributed, highly available, data-driven enterprise software applications, and will make customer empathy and user-centric design a priority. The ideal candidate is excited about new product development, is comfortable pushing the envelope and challenging the status quo, sets high standards for themselves and others, and works well with ambiguity.
What you will do:
- Design and develop cloud-native, large-scale, distributed, highly available, data-driven enterprise software applications
- Design and implement cloud-native applications on AWS, Azure, or GCP with a full DevOps lifecycle (CI, CD, and CM), containerization, and virtualization
- Design and implement a microservice-based architecture
- Apply best practices and design patterns such as SOA
- Explore new technology solutions, innovation, evolution, and trade-offs
- Mentor and collaborate with team members and across teams
- Ensure software products meet all non-functional requirements including operational and security needs
What you bring:
- 8+ years of experience designing and implementing large-scale, distributed, highly available, data-driven enterprise software applications
- 5+ years of experience designing and implementing cloud-native applications on AWS, Azure or GCP with full DevOps lifecycle (CI, CD and CM), containerization and virtualization
- Strong understanding of large-scale distributed systems and information and technology architectures, including EAI patterns, messaging, microservices architecture, information models, and distributed logging/tracing
- A good understanding of databases (relational/columnar/NoSQL), programming languages/paradigms (Java/Scala/Python, Akka, etc.), caching, Enterprise Search, SOA, EDA, and CI/CD
- A solid understanding of infrastructure automation, DevOps concepts, and application security
- Working knowledge of one or more Machine Learning, Analytics, and Data Science tech stacks
- Hands-on knowledge of stream- and batch-processing technologies
- Excellent understanding of the latest trends in distributed computing, algorithms, software technologies, development practices/tools, build and test automation, and network management
- Experience with Java, WebLogic, and Oracle RDBMS-based client-server enterprise applications
- 5+ years of experience with big data technologies. Representative technologies:
  - ETL, data egress and ingress: Data Pipeline, Informatica/Talend/NiFi, Sqoop/Gobblin/Marmaray, Flume
  - Storage: data lake, Hadoop/HDFS, Elasticsearch, Oracle
  - Query engines: Hive/Presto/Impala
  - Processing: Spark/Apache Ignite/GridGain; Lambda
  - Event-driven frameworks: Kafka
- Outstanding collaboration and communication skills
- A comfort level with challenging the status quo
- Ability to deal with ambiguity
- Empathy toward the customer
- B.S. in Computer Science or equivalent experience
Bonus points:
- 3+ years of Healthcare domain experience, preferably with expertise in provider data
- Familiarity with HL7 or FHIR data models