Accountability 1
- Install, configure, and maintain the Data Lakehouse, RPA, and SAP platforms.
Accountability 2
- Manage L2/L3 support for the Data Lakehouse, RPA, and SAP technologies, ensuring reliability and availability.
Accountability 3
- Diagnose and resolve data and platform issues promptly and efficiently.
Accountability 4
- Evolve the Data Lakehouse according to data integration, processing, and output requirements.
Accountability 5
- Deploy data assets and solutions, ensuring effective implementation and integration within the Data Lake ecosystem.
Accountability 6
- Assess and manage technical and human resources effectively.
Accountability 7
- Initiate optimization processes to enhance functionality and improve overall operations management.
Accountability 8
- Provide regular reports on project status, issues, and resolutions.
Minimum qualifications:
- Bachelor’s degree in Computer Science or Software Engineering.
Minimum experience:
- 4+ years of experience implementing, configuring, and providing L2/L3 support for Data Lakehouse, RPA, and SAP technologies.
Other Skills / Experience:
Platform Implementation and Support:
- Proficient in implementing, configuring, and supporting platform technologies, including Data Lakehouse and RPA ecosystems. Skilled in providing L2/L3 support for various data & RPA platform components.
Platform Installation, Management and Administration:
- Expertise in installing, configuring, and managing diverse technology platforms, with a focus on Data Lake and RPA platforms such as Hadoop, Cloudera, and UiPath. Experience in administering Data Warehouse solutions and ensuring their reliability.
Big Data Technology Proficiency:
- Skilled in supporting and troubleshooting Big Data technologies such as Hadoop, Cloudera, Spark, Impala, and Kafka, with the ability to diagnose and resolve Big Data platform issues effectively (a short sketch follows).
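For illustration only, a minimal PySpark sketch of the kind of routine health check this support work involves; the table name and alert wording are hypothetical placeholders, not part of the role.

    from pyspark.sql import SparkSession

    # Start a Spark session for a quick diagnostic run.
    spark = SparkSession.builder.appName("lakehouse-health-check").getOrCreate()

    # Hypothetical table: confirm it is readable and not silently empty.
    df = spark.read.table("sales_db.transactions")
    row_count = df.count()

    if row_count == 0:
        print("ALERT: sales_db.transactions is empty -- check upstream ingestion")
    else:
        print(f"OK: sales_db.transactions holds {row_count} rows")

    spark.stop()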
ETL Process Management:
- Competent in managing ETL processes and tools such as CDC, Informatica, and SSIS to ensure efficient data integration and processing (see the sketch below).
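As a hedged illustration of a single extract-transform-load step in plain Python (the file names and columns are invented for the example; real jobs would typically run inside a tool such as Informatica or SSIS):

    import csv

    def run_etl(src_path: str, dst_path: str) -> None:
        # Extract rows from a source CSV, transform them, and load the
        # cleaned result into a destination CSV.
        with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.DictWriter(dst, fieldnames=["customer_id", "amount_usd"])
            writer.writeheader()
            for row in reader:
                if row.get("amount"):  # skip rows with a missing amount
                    writer.writerow({
                        "customer_id": row["customer_id"].strip(),
                        "amount_usd": f"{float(row['amount']):.2f}",
                    })

    run_etl("raw_orders.csv", "clean_orders.csv")  # hypothetical file names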
Operating Systems Administration:
- Expertise in administering Red Hat Linux and Windows Server operating systems commonly used in Big Data environments, ensuring smooth operations.
Cloud and Virtualization Expertise:
- Proficient in setting up and managing containerized servers, cloud services, and virtualization environments, with hands-on experience in cloud-based data platform management and support (illustrated below).
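For example, a short sketch using the Docker SDK for Python (the docker package) to check which containerized services are up; this is one plausible tool among many, not a prescribed stack.

    import docker

    # Connect to the local Docker daemon and report each container's state.
    client = docker.from_env()
    for container in client.containers.list(all=True):
        print(f"{container.name}: {container.status}")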
Data Engineering Principles Understanding:
- Familiarity with data engineering principles, data warehousing concepts, and ETL processes to optimize data workflows.
Programming and Scripting Familiarity:
- Familiar with scripting and programming languages such as Python for developing data pipelines and automating tasks (a minimal example follows).
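A minimal sketch of the kind of automation task meant here, assuming a hypothetical staging directory and a seven-day retention rule:

    import shutil
    import time
    from pathlib import Path

    STAGING = Path("/data/staging")   # hypothetical landing zone
    ARCHIVE = Path("/data/archive")   # hypothetical archive location
    MAX_AGE_DAYS = 7

    ARCHIVE.mkdir(parents=True, exist_ok=True)
    cutoff = time.time() - MAX_AGE_DAYS * 86_400  # age threshold in seconds

    # Move stale CSV files out of staging so pipelines see only fresh data.
    for f in STAGING.glob("*.csv"):
        if f.stat().st_mtime < cutoff:
            shutil.move(str(f), str(ARCHIVE / f.name))
            print(f"archived {f.name}")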
Source Control and CI/CD Practices:
- Competence in software development methodologies, source control management, and continuous integration/delivery practices.
Stakeholder Communication Skills:
- Ability to effectively communicate technical concepts to both technical and non-technical stakeholders, ensuring clear understanding.
Reporting Capability:
- Skilled in providing regular reports covering daily tasks, project status, issues, and resolutions, fostering transparency and alignment with project goals.