About Company
Using proven crypto and blockchain technology honed over a decade, Ripple's enterprise-grade solutions are faster, more transparent, and more cost-effective than traditional financial services. Our customers use these solutions to source crypto, facilitate instant payments, empower their treasury, engage new audiences, lower capital requirements, and drive new revenue. Founded in 2012, Ripple's vision is to enable a world where value moves as seamlessly as information flows today: an Internet of Value. Ripple is the only enterprise blockchain company today with products in commercial use. Ripple's global payments network includes over 300 customers across 40+ countries and six continents.
Job Description
Summary
As a Software Engineer II, Data & AI, you will be one of the core engineers on Ripple's central Data Engineering team. This team implements the data ingestion and transformation that powers analytics, machine learning, and various business functions at Ripple. You are curious about the bottlenecks and failure modes of a system and look for opportunities to continually improve its cost and performance characteristics. You are hands-on in driving key technical decisions, ensuring the right tradeoffs are made to deliver high-quality results and measurable customer value. You work well across functions and teams, including data science, product, application engineering, compliance, finance, and others. Your passion for good engineering is complemented by strong instincts for delivering value.
What you'll do:
- Ships solutions efficiently for both large and small projects.
- Handles ambiguity in requirements, defining and proposing solutions for them.
- Writes, presents, and gets agreement on the design document for a project, highlighting the architecture, timelines, and alternatives considered.
- Owns the development and rollout of small to mid-sized projects.
- Writes clean tech specs and identifies risks before starting major projects.
- Recognizes trade-offs and identifies impact/risks between alternative solutions.
- Improves code structure and architecture in data pipelines for testability and maintainability.
- Plays an active role in breaking down initiatives that span multiple sprints and tasks.
- Leads feature development with 1-2 collaborators.
- Collaborates on AI agents and conversational tools that let business users get insights from data without needing technical skills.
What you'll bring:
- Proficiency (3–6 years) in at least one primary programming language (e.g., Python, Scala) and comfort working with SQL
- Experience with at least one data warehouse or data lake platform, such as Databricks
- Ability to write sophisticated code and comfort picking up new technologies independently
- Some hands-on experience applying AI or machine learning in data-related projects, such as data pipelines, data quality checks, or simple AI-powered tools
- Exposure to tools that use large language models, such as coding assistants, chat-based data tools, search tools that answer questions from documents, or basic workflow automation with AI.
- Familiar with developing distributed systems with experience in scalable data pipelines
- Experience with RESTful APIs and server-side API integration
- Hands-on or conceptual familiarity with AWS cloud resources (S3, Lambda, API Gateway, Kinesis, Athena, etc.)
- Experience orchestrating CI/CD pipelines using GitLab and Terraform
- Excellence at taking vague requirements and crystallizing them into scalable data solutions
- Excitement about operating independently, pursuing excellence, and learning new technologies and frameworks
For positions that will be based in CA, the annual salary range for this position is below. Actual salaries may vary based on numerous factors including, among other things, an individual applicant's experience and qualifications for the position. This range does not include equity or additional compensation, such as bonuses or commissions.
CA Annual Base Salary Range
$140,000–$174,999 USD
Skills
- Big Data Analysis & SQL
- Clear communication
- Programming language (R, Python, Scala, Matlab)
- Teamwork skills
