Minimum qualifications:
- Bachelor's degree in computer science or software engineering.
Minimum experience:
- 4+ years of experience in the implementation, configuration, and L2 & L3 support of Data Lakehouse, RPA, and SAP technologies.
Other Skills / Experience:
Platform Implementation and Support:
- Proficient in implementing, configuring, and supporting platform technologies, including Data Lakehouse and RPA ecosystems. Skilled in providing L2/L3 support for various data & RPA platform components.
Platform Installation, Management and Administration:
- Expertise in installing, configuring, and managing diverse technology platforms, with a focus on Data Lake & RPA platforms such as Hadoop, Cloudera, and UiPath. Experience in administering Data Warehouse solutions and ensuring their reliability.
Big Data Technology Proficiency:
- Skilled in supporting and troubleshooting Big Data technologies such as Hadoop, Cloudera, Spark, Impala, and Kafka. Ability to diagnose and resolve issues related to Big Data platforms effectively.
ETL Process Management:
- Competent in managing ETL processes and tools like CDC, Informatica, and SSIS to ensure efficient data integration and processing.
Operating Systems Administration:
- Expertise in administering Red Hat Linux and Windows Server operating systems commonly used in Big Data environments, ensuring smooth operations.
Cloud and Virtualization Expertise:
- Proficient in setting up and managing containerized servers, cloud services, and virtualization environments. Hands-on experience with cloud-based data platform management and support.
Data Engineering Principles Understanding:
- Familiarity with data engineering principles, data warehousing concepts, and ETL processes to optimize data workflows.
Programming and Scripting Familiarity:
- Familiarity with scripting and programming languages such as Python for developing data pipelines and automating tasks.
Source Control and CI/CD Practices:
- Competence in software development methodologies, source control management, and continuous integration/delivery practices.
Stakeholder Communication Skills:
- Ability to effectively communicate technical concepts to both technical and non-technical stakeholders, ensuring clear understanding.
Reporting Capability:
- Skill in providing regular reports covering daily tasks, project status, issues, and resolutions, fostering transparency and alignment with project goals.