Sharegain began with one question: If only the largest institutions exercise the right to lend their stocks, bonds, and ETFs, what would it take to unlock this revenue opportunity for every investor?
Our team of experts in the UK, US and Israel built the solution: a platform that empowers online brokers, private banks, and wealth managers to offer securities lending to their clients. We call it SLaaS: Securities Lending as a Service. It’s a fully digital, customizable, end-to-end solution that automates front- and back-office operations. Institutions and investors are now free to earn more from what they own.
Every Sharegainer has their own backstory, but we all share an ambition to do things differently – bigger, better, and greater. Together we’re on a mission to democratize capital markets by building a more liquid world. The more we share, the more we all gain.
Sharegain is seeking a dedicated Data Engineer to join our R&D team. You will own our data processing framework and infrastructure end to end. Your work will steer data-centric decision-making and help evolve Data Center monitoring and management solutions across the organization. The role centers on designing and building large-scale telemetry pipelines and implementing solutions that safeguard data integrity.
Responsibilities:
- Be a primary point of contact for data, pipelines, and data warehousing
- Construct robust data integration processes and efficient ETL workflows
- Diagnose and resolve data-related issues, validating result sets to a high standard of quality
- Work with Azure and other cloud databases
- Collaborate with teams across Sharegain (R&D, architects, business) to design and deliver solutions tailored to their operational needs
- Help define the architecture of monitoring and analytics solutions for large-scale Data Centers
- Extend the Data Lake by integrating new data sources
- Guide other teams in applying data-driven decision-making
- Design and maintain well-organized schemas and data structures
Requirements:
- A Bachelor's or Master's degree in Computer Engineering or Computer Science
- 5+ years of hands-on experience in Data Engineering
- Excellent knowledge of SQL with a strong advanced-analytics orientation – mandatory
- Very good understanding of data and schema standards and concepts – mandatory
- Advanced coding skills to develop robust ETL pipelines, related tools, and PoCs
- Strong analytical skills and a flair for innovative thinking
- A solid grasp of system-level problem-solving
- A collaborative team player, comfortable working with R&D teams both locally and abroad
- Excellent communication skills, with the ability to convey complex ideas clearly
- Experience with modern analytics tools and platforms such as Spark and Databricks
Advantages:
- Previous experience as a DBA
- Experience with NoSQL databases is a valuable addition
- Ability to prototype ideas and showcase their value
- Familiarity with cloud-native development and deployment methodologies
- Proven expertise with the Azure cloud platform
- Background in data center design and technologies