Description:
This role requires strong technical ability, adaptability, inventiveness, business acumen, and excellent communication skills. Sometimes you will work with clients on traditional big data technologies such as SQL data warehouses and on-premises Hadoop data lakes; at other times you will help them discover and implement the most cutting-edge tools: modern infrastructure deployment stacks, cloud-based elastic compute engines such as Spark and Kubernetes, and deep learning on GPUs. If you are excited about staying at the bleeding edge of big data and AI while maintaining a strong working knowledge of existing enterprise systems, this role will be a great fit for you.
How You'll Make An Impact
- Articulate the challenges of building Enterprise Data Science Platforms to technical and non-technical audiences.
- Understand customer requirements for scalability, availability, and security, and provide architecture recommendations.
- Deploy Dataiku in a wide variety of technical environments (on-premises or cloud, Kubernetes, Spark, Hadoop, etc.).
- Design and build reference architectures, decks, scripts, and diagrams to make the deployment and maintenance of Dataiku smooth and easy.
- Coordinate with Revenue and Customer teams to deliver a consistent experience and message to our customers.
- Train our clients and partners in the art and science of administering a bleeding-edge Elastic AI platform.
- Drive technical success by being a trusted advisor to our prospects and our internal account teams, providing the best recommendations and advancing customer accounts.
- Troubleshoot complex customer issues when necessary.
What You'll Need To Be Successful
- Strong Linux system administration experience.
- Hands-on experience with the Kubernetes ecosystem for setup, administration, troubleshooting, and tuning.
- Experience with cloud platforms such as AWS, GCP, and Azure.
- Grit when faced with technical issues. You don't rest until you understand why something doesn't work.
- Comfort and confidence in client-facing interactions.
- Hands-on experience with the Hadoop and/or Spark ecosystem for setup, administration, troubleshooting, and tuning.
- Some experience with Python.
- Familiarity with Ansible or other deployment and provisioning tools (Terraform, CloudFormation, etc.).
- Experience in a client-facing role, as well as experience working with system integrators.