Digital Transformation & Sustainability
We Are Hiring
We're looking for talented people with epic personalities who are constantly pursuing the next big thing.
About the Job
- Own and drive the architecture for a project or product
- Choose the right architecture to solve the business problem
- Lead a team of technical people; mentor and guide them through the solution
- Communicate and translate the technical solution to the team and other stakeholders
- Drive re-use initiatives and promote a re-use culture within the team
- Design and implement the components of the data pipeline: integration, storage, processing, and analysis of business data
- Analyze business requirements and derive a conceptual model by identifying entities and the relationships between them; identify attributes and create logical and physical models
- Apply proficiency in business process modeling, process flow modeling, and data flow modeling
- Create flowcharts, process flows, and data flow diagrams
- Collaborate with application development and support teams on various IT projects
- Develop complex modules and proofs of concept
- Lead and drive performance optimization efforts
- Define standards and guidelines for the project and ensure the team follows them
- Assist in setting the strategic direction for databases, infrastructure, and technology through research and development activities
- Make automation a key driver: development and testing should be automated through frameworks and tools
- Monitor performance and advise on necessary infrastructure changes through capacity planning and sizing exercises
- Work with vendors such as cloud providers and consultants to deliver the project
- Define non-functional requirements and ensure the solution adheres to them
- Define agile and DevOps best practices while implementing solutions
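To make the pipeline responsibilities above concrete, here is a minimal, dependency-free sketch of the integration → processing → analysis stages composed as plain functions. All names and the sample data are illustrative, not part of this role's actual stack.

```python
import csv
import io

# Hypothetical sample input standing in for an integrated business data feed.
RAW = "region,amount\nEMEA,100\nAPAC,250\nEMEA,50\n"

def integrate(raw: str) -> list[dict]:
    """Ingest raw CSV text into records (the integration stage)."""
    return list(csv.DictReader(io.StringIO(raw)))

def process(records: list[dict]) -> list[dict]:
    """Cast types and drop malformed rows (the processing stage)."""
    out = []
    for r in records:
        try:
            out.append({"region": r["region"], "amount": int(r["amount"])})
        except (KeyError, ValueError):
            continue  # skip bad rows rather than failing the whole batch
    return out

def analyze(records: list[dict]) -> dict:
    """Aggregate amounts per region (the analysis stage)."""
    totals = {}
    for r in records:
        totals[r["region"]] = totals.get(r["region"], 0) + r["amount"]
    return totals

totals = analyze(process(integrate(RAW)))
print(totals)  # {'EMEA': 150, 'APAC': 250}
```

In a production pipeline each stage would be backed by dedicated storage and processing infrastructure; the point here is only the staged, composable shape.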
Requirements
- Data Architecture
- Oracle Autonomous Data Warehouse
- Data Cleansing
- Data Catalogue
- Kafka Data Streaming
- Database ML
- Disaster Recovery (DR)
- Data Migration
About the Job
- Excellent commercial awareness; inquisitive about "the why", the bigger picture, and the commercial and operational value at play
- Proven track record of eliciting and interpreting difficult problems and guiding clients toward sustainable solutions (via interviews, workshops, fit/gap analysis, thorough documentation, and more)
- Excellent at translating business objectives into IT needs
- Experienced in partnering with the Sales team in a solution-selling approach (ideally in the SaaS/product space) to sell in and upsell
- Willing to travel to clients
- Fluent in English, both verbal and written, and able to communicate in a client-centric way
- Experience with Postman (or similar), Miro, Slack, Jira, and Confluence is a plus
- You like having varied responsibilities and working in small teams on challenging assignments
- Project management experience is a plus
Requirements
- Postman
- Jira
- Miro
- Mural
- Database
- Business Strategy
- Functional Design Document
- Business Analytics
- Business Requirement Document
About the Job
- Excellent written and verbal communication skills
- Strong attention to detail and organizational skills
- Experience with Microsoft Excel, Google Sheets, and CRM software
- Familiarity with marketing and sales principles and practices
- Ability to work efficiently and effectively in a remote team setting
- Currently pursuing a degree in Marketing, Business Administration, or a related field, or a recent graduate
Requirements
- CRM
- Google Sheets
- Canva
- Airtable
- Database
- Business Analytics
- Notion
- Process Mapping
- Workflow Mapping
About the Job
What you will do:
- Design and implement scalable, event-driven, containerized data pipelines for real-time ingestion, transformation, and analysis.
- Optimize for extreme throughput and volume using distributed computing and storage frameworks and cloud-native technologies.
- Develop middleware and APIs for robust data communication between services.
- Implement DataOps tooling, including data-specific CI/CD pipelines and processes.
- Maintain data security aligned with industry standards and regulatory frameworks.
- Mentor junior data engineers for continuous skill and process improvement.
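As a dependency-free illustration of the event-driven shape described above, this sketch uses an in-process queue: a producer pushes JSON events, a consumer transforms each one and appends it to a sink. A real deployment would use Kafka topics and containerized workers; the names and payloads here are invented.

```python
import json
import queue
import threading

events = queue.Queue()
sink = []          # stands in for a downstream store
SENTINEL = None    # end-of-stream marker

def producer() -> None:
    """Publish a few JSON-encoded events, then signal completion."""
    for i in range(3):
        events.put(json.dumps({"id": i, "value": i * 10}))
    events.put(SENTINEL)

def consumer() -> None:
    """Consume events until the sentinel, applying a transformation."""
    while True:
        msg = events.get()
        if msg is SENTINEL:
            break
        evt = json.loads(msg)
        evt["value_doubled"] = evt["value"] * 2  # the "transform" step
        sink.append(evt)

t = threading.Thread(target=consumer)
t.start()
producer()
t.join()
print(len(sink))  # 3
```

The sentinel pattern gives the consumer a clean shutdown path, which is the in-process analogue of closing a stream partition.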
Requirements
- CI/CD
- Postman
- PL/SQL
- Python
- AWS
- Kafka
- Kubernetes
- Terraform
- Elasticsearch
About the Job
- System monitoring and issue escalation
- Familiarity with techniques and tools for crawling, extracting, and processing data (e.g., Scrapy, Pandas), and experience running large-scale data scraping projects
- Monitoring and supporting scraping assets
- Supporting existing data sources for scraping and adding new ones
- Writing reports on system status and possible issues
- Running scrapers on demand
- Database development experience focused on high-volume data storage and retrieval applications
- Fluency and excellent communication skills in English are a must in order to work with an international team
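The posting names Scrapy for the extraction work; as a dependency-free sketch of the same idea, this uses the stdlib `html.parser` to pull item titles out of a static HTML snippet. The markup and the `item` class name are invented for illustration.

```python
from html.parser import HTMLParser

# Invented sample page standing in for a fetched scraping target.
HTML = """
<ul>
  <li class="item">Alpha</li>
  <li class="item">Beta</li>
  <li>skip me</li>
</ul>
"""

class ItemExtractor(HTMLParser):
    """Collect the text of <li class="item"> elements."""

    def __init__(self):
        super().__init__()
        self.in_item = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "li" and ("class", "item") in attrs:
            self.in_item = True

    def handle_data(self, data):
        if self.in_item and data.strip():
            self.titles.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "li":
            self.in_item = False

parser = ItemExtractor()
parser.feed(HTML)
print(parser.titles)  # ['Alpha', 'Beta']
```

A production scraper would add fetching, retries, and proxy rotation on top; only the extraction step is shown here.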
Requirements
- Python
- Scrapy
- BeautifulSoup
- Pandas
- Proxy rotation
- JSON
- XML
- CSS
- XHR