NEC Software Solutions (India)
On 1st July 2021, Rave Technologies became NEC Software Solutions India, bringing us under the global NEC Corporation brand. We are proud to be part of an organisation with 122 years of experience in technology and innovation.
We have more than 30 years of experience providing end-to-end IT services across the globe, and we have earned a reputation for delighting our customers by consistently surpassing expectations and helping them deliver robust, market-ready software products that meet the highest standards of engineering and user experience. Supported by more than 1,300 exceptionally talented people, we are a hub for offshore support and technology services.
We work with diverse industry verticals, including publishing, media, financial services, retail, healthcare and technology companies around the world. Our customers range from two-person startups to multibillion-dollar listed companies.
For more information, visit www.necsws.com/india.
About NEC Corporation
NEC Corporation is a Japanese multinational information technology and electronics company headquartered in Tokyo, Japan. It is recognised as a ‘Top 50 Innovative Company’ globally, and the NEC Group provides “Solutions for Society” that promote the safety, security, fairness and equality of society. Its main goal is to help create a safer society through technological innovation.
NEC Corporation has established itself as a leader in the integration of IT and network technologies under the brand statement “Orchestrating a brighter world.” NEC enables businesses and communities to adapt to rapid changes in both society and the market, delivering the social values of safety, security, fairness and efficiency to promote a more sustainable world where everyone has the chance to reach their full potential.
For more information, visit NEC at https://www.nec.com.
This is a Data Engineering position with the Data Science and Engineering Technologies team. The candidate must be able to analyze, design, develop, integrate, run, and support ETL and data-related jobs across applications and data warehouses, using a mix of technologies and architectures, application servers, databases, logs, and APIs. The primary platforms for this position will be Talend and Snowflake.
The candidate will be required to:
- Develop extract-transform-load (ETL) jobs in Talend and demonstrate experience with Talend Studio, Talend Management Console, and Git.
- Demonstrate proficiency in Snowflake: designing SQL for ELT operations, writing Snowflake stored procedures, and SQL scripting. The Data Engineer will also work with data architects to write and deploy DDL change scripts. Knowledge of Python via the Snowpark API is a plus.
- Troubleshoot data and processing errors using the features that both Talend and Snowflake make available, such as logs, statement history and Snowflake’s Time Travel.
- Work with Talend jobs that integrate AWS S3 data files. Experience encrypting and decrypting files using Talend built-ins is desirable.
- Create Talend input and output components that call REST APIs.
- Read and understand technical specifications provided by business analysts and team technical leads.
- Demonstrate strong skills in advanced SQL and become familiar with the differences in and strengths of the SQL dialects of Snowflake, BigQuery, Oracle and SQL Server.
- Create ETL extraction logic from different types of interfaces, such as APIs, web services, external and on-premises databases and warehouses.
- Thoroughly test ETL pipelines and data enrichments for accuracy and performance.
- Troubleshoot and fix major system problems when they arise in core and supplemental data systems.
- Provide guidance and knowledge transfer to the Operations Team as necessary to ensure that the jobs run smoothly in production.
- Mentor and train other developers in areas of design, coding and deployment.
- Be flexible to work occasionally during non-business hours.
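As a small illustration of the extraction logic described above (pulling data from REST APIs and shaping it for a warehouse), the Python sketch below flattens a nested JSON payload into flat rows ready for staging. It is a minimal, hypothetical example; the payload shape and field names are assumptions, not part of any specific NEC system:

```python
import json

def flatten_orders(payload: str) -> list:
    """Flatten a nested JSON payload (e.g. from a REST API response) into
    flat rows suitable for loading into a warehouse staging table.

    Hypothetical payload shape:
    {"orders": [{"id": ..., "customer": {"name": ..., "email": ...}, "total": ...}]}
    """
    doc = json.loads(payload)
    rows = []
    for order in doc.get("orders", []):
        customer = order.get("customer", {})
        rows.append({
            "order_id": order.get("id"),
            "customer_name": customer.get("name"),
            "customer_email": customer.get("email"),
            # Cast the total to a numeric type so the load step can
            # target a NUMBER column without implicit conversion.
            "total": float(order.get("total", 0)),
        })
    return rows

sample = ('{"orders": [{"id": 1, '
          '"customer": {"name": "Ada", "email": "ada@example.com"}, '
          '"total": "19.99"}]}')
print(flatten_orders(sample))
```

In a real pipeline the same transformation would typically be expressed as a Talend job or Snowflake SQL; this sketch only shows the flattening step in isolation.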
The ideal candidate will:
- Have experience with Talend Studio, Talend Management Console, and Bitbucket (Git).
- Have worked in a team environment coordinating Talend jobs, branches and merges with other developers.
- Have developed ETL pipelines that meet functional requirements. The pipelines will also address non-functional requirements such as performance, scalability, availability, reliability and security.
- Have familiarity with Google Cloud Platform (GCP), namely Data Fusion and BigQuery. Java or Python scripting on Google Cloud Platform is a huge plus.
- Have experience writing procedures that sanitize and enrich data using SQL, Java or Python.
- Have a working knowledge of XML, JSON and other forms of data streaming artifacts and related technologies in a Java/Python environment.
- Have strong written and verbal communication skills.
- Be willing to mentor and train other developers in areas of design, coding and deployment.
- Be able to manage work effectively in a multi-project environment.
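The sanitize-and-enrich requirement above can be sketched in a few lines of Python. This is a generic, hypothetical procedure (the field names and cleaning rules are assumptions for illustration), not any particular production routine:

```python
import re

def sanitize_and_enrich(record: dict) -> dict:
    """Sanitize a raw record (trim whitespace, normalise the email) and
    enrich it with a derived field. Field names are hypothetical."""
    # Sanitize: strip stray whitespace from every string value.
    clean = {k: v.strip() if isinstance(v, str) else v
             for k, v in record.items()}
    # Normalise the email to lower case for consistent joins.
    email = clean.get("email", "").lower()
    clean["email"] = email
    # Enrich: derive the email domain; None when the email is malformed.
    match = re.match(r"^[^@\s]+@([^@\s]+)$", email)
    clean["email_domain"] = match.group(1) if match else None
    return clean

print(sanitize_and_enrich({"name": "  Grace Hopper ",
                           "email": " Grace@Example.COM "}))
```

The same logic would usually live in a SQL stored procedure or a Talend component in practice; the point is the separation of sanitization (cleaning what is there) from enrichment (adding derived fields).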
Education, Experience, and Technical Requirements:
Bachelor’s degree or equivalent experience. 5+ years of proven results in system development, implementation, and operations are required, along with a strong understanding of design patterns with a focus on tiered, large-scale data systems.