Immuta is a data management platform purpose-built to accelerate the development of machine learning and AI. The world's largest enterprises rely on our software to enable their data scientists, data owners, and governance teams to work together, in a single digital platform, to access data faster and in a legal and ethical manner, lowering business risk and bridging the gap between the law and AI. We are always on the lookout for world-class talent to join the team, and we recognize that our employees are the heart and soul of our solutions. We focus on recruiting the most talented people, treating them right, and letting them do what they do best! Smart people want to work with smart people, and we love people who are passionate about what they do and are driven to find ways to do it better. Our people-centered culture ensures that your personal and professional interests are both supported and celebrated!
This is a position for an experienced software engineer to contribute directly to the Immuta product. This role requires an individual who can manage both big-picture and fine-grained tasks - someone mature enough to take a user story and work with the team from architectural design through to the implementation and testing of the features. More specifically, this role requires experience contributing to distributed systems software, for example, Apache Hadoop ecosystem projects such as HDFS, HBase, Spark, and Impala, to name a few. This role will be critical to expanding Immuta's first-class support for transient cloud workloads, where distributed processing and separating compute from storage are critical. You will be part of a talented team that values everyone's input and creativity, giving you room to explore creative solutions, the freedom to manage your time as a mature contributor, and opportunities to speak and provide thought leadership through conferences, publications, and customer engagements.
WE'RE LOOKING FOR SOFTWARE ENGINEERS WHO…
- have experience contributing production code to distributed data processing frameworks such as HDFS, Spark, or Impala.
- are outstanding problem solvers and can tackle tough challenges with innovative thinking and determination.
- have excellent communication skills and can effectively discuss technical design.
- have a deep understanding of at least one programming language and its associated libraries, and are willing to learn more.
TECHNOLOGIES YOU'LL USE
- Analytics experience
- Distributed data processing frameworks experience (for example, HDFS, Spark)
- SQL expertise
- Docker expertise
- Linux systems knowledge and comfort with the command line
- Full-stack troubleshooting
- Python or R experience
- IaaS familiarity