Senior Scala & Big Data Engineer | Remote | Financial Trading Systems
Remote
Full-time
You'll join our financial technology team as a Senior Scala & Big Data Engineer, architecting sophisticated data pipelines that power real-time trading platforms. In this pivotal role, you'll apply cutting-edge distributed computing technologies to complex challenges in financial markets while working remotely with a talented global team.
Key Responsibilities:
- Design and develop robust, scalable data processing systems in Scala using Apache Spark 3.5+ for high-frequency trading applications.
- Architect end-to-end data pipelines that transform vast quantities of market data into actionable trading insights with sub-second latency requirements.
- Optimize distributed processing workflows on AWS EMR and EKS platforms for maximum throughput and cost efficiency.
- Collaborate with quantitative analysts to implement and productionize sophisticated trading algorithms and predictive models.
- Ensure data quality, consistency, and reliability across our distributed financial data ecosystem.
- Develop comprehensive automated testing frameworks for mission-critical Spark applications and data workflows.
- Create efficient ETL processes leveraging Apache Airflow 2.8+ for complex workflow orchestration.
- Implement high-performance data storage solutions incorporating HDFS, S3, and Cassandra technologies.
- Build low-latency data streaming pipelines with Apache Kafka 3.6+ for real-time market data processing.
- Document system architecture, data models, and integration interfaces for knowledge sharing.
- Participate in code reviews and establish engineering best practices within the distributed team.
- Monitor and troubleshoot production data systems in a 24/7 financial trading environment.
Required Skills:
- 4+ years of professional experience with Scala programming language and functional programming paradigms.
- Strong expertise in Apache Spark framework (3.x) for large-scale distributed data processing.
- Hands-on experience building data pipelines with Hadoop ecosystem components including HDFS and Hive 3.x.
- Proficiency with workflow orchestration tools, particularly Apache Airflow 2.x+ or similar technologies.
- Demonstrated experience designing and implementing fault-tolerant data pipelines for high-throughput systems.
- Working knowledge of distributed messaging systems such as Apache Kafka 3.x and event-driven architectures.
- Solid understanding of database fundamentals with practical experience in both SQL and NoSQL systems.
- Advanced SQL skills for complex data transformation and analysis across various database platforms.
- Extensive experience with AWS cloud services stack, particularly S3, Athena, EMR, Glue, and EKS.
- Ability to read and understand Python code written by data scientists and incorporate it into production workflows.
- Experience writing and maintaining automated tests for Spark applications and data pipelines.
- Excellent spoken and written English communication skills for effective remote collaboration.
Nice to Have:
- Prior experience in financial services, trading platforms, or investment systems.
- Familiarity with financial market data structures, trading protocols, and regulatory requirements.
- Experience with real-time stream processing frameworks like Spark Streaming, Flink, or Kafka Streams.
- Knowledge of container orchestration with Kubernetes and infrastructure-as-code practices.
- Understanding of data governance, security, and compliance requirements in financial services.
- Experience with data visualization tools and creating dashboards for business stakeholders.
- Background in machine learning workflows and MLOps in production environments.
- Experience with Scala 3 (Dotty) and its advanced type system features.
- Expertise in performance tuning and optimization of JVM-based applications.
Why Join Us:
You'll become part of a forward-thinking fintech company that's revolutionizing online trading and investment platforms. You'll work on intellectually stimulating big data challenges, and your contributions will directly impact how financial markets operate. We offer competitive compensation, fully remote working arrangements, continuous learning opportunities, and collaboration with world-class engineers. Your expertise will help shape the future of financial technology while advancing your career in distributed systems and big data engineering.