Accountability 1
- Provide technical leadership and expertise in maintaining and evolving the Data Lakehouse and RPA platforms.
Accountability 2
- Oversee L2 and L3 support, architecture design, problem-solving, and continuous improvement initiatives.
Accountability 3
- Install, configure, and maintain the Data Lakehouse and RPA platforms, ensuring reliability and availability.
Accountability 4
- Design scalable and resilient architectures for the Data Lakehouse environment.
Accountability 5
- Provide mentorship to junior team members and manage stakeholder relationships.
Accountability 6
- Define technical strategies aligned with business objectives.
Accountability 7
- Oversee deployment of data assets and data products, ensuring effective implementation and integration within the Data Lakehouse ecosystem.
Accountability 8
- Play a key role in project management, reporting to senior management, and driving technical innovation within the organization.
Minimum educational qualification:
- Bachelor’s degree in computer science, software engineering, or a relevant field
Minimum experience:
- 7+ years of experience with Data Lakehouse and RPA configuration, architecture, design, and L2 & L3 support
Other Skills & Competence:
Platform Implementation and Support:
- Proficient in implementing, configuring, and supporting platform technologies, including Data Lakehouse and RPA ecosystems. Skilled in providing L2/L3 support for various data & RPA platform components.
Platform Installation, Management and Administration:
- Expertise in installing, configuring, and managing diverse technology platforms, with a focus on Data Lake & RPA platforms like Hadoop, Cloudera, UiPath. Experience in administering Data Warehouse solutions and ensuring their reliability.
Big Data Technology Proficiency:
- Skilled in supporting and troubleshooting Big Data technologies such as Hadoop, Cloudera, Spark, Impala, and Kafka. Ability to diagnose and resolve issues related to Big Data platforms effectively.
ETL Process Management:
- Competent in managing ETL processes and tools like CDC, Informatica, and SSIS to ensure efficient data integration and processing.
Operating Systems Administration:
- Expertise in administering Linux RedHat and Windows Server operating systems commonly used in Big Data environments, ensuring smooth operations.
Cloud and Virtualization Expertise:
- Proficient in setting up and managing containerized servers, cloud services, and virtualization environments. Hands-on experience with cloud-based data platform management and support.
Data Engineering Principles Understanding:
- Familiarity with data engineering principles, data warehousing concepts, and ETL processes to optimize data workflows.
Programming and Scripting Familiarity:
- Familiarity with scripting and programming languages such as Python for developing data pipelines and automating tasks.
Source Control and CI/CD Practices:
- Competence in software development methodologies, source control management, and continuous integration/delivery practices.
Reporting Capability:
- Skill in providing regular reports covering daily tasks, project status, issues, and resolutions, fostering transparency and alignment with project goals.
Architecture Design and Scalability:
- Experience in designing scalable and resilient architectures for Data Lake & RPA environments, ensuring robustness and efficiency in handling large-scale data operations.
Technical Leadership and Mentorship:
- Ability to provide technical leadership and mentorship to junior team members, guiding them in best practices and fostering a culture of continuous learning and improvement.
Project Management Expertise:
- Proficiency in project management methodologies to lead and execute projects effectively, ensuring successful delivery and alignment with business objectives.
Stakeholder Management:
- Skilled in stakeholder management and aligning technical strategies with business objectives, ensuring that technical decisions support overarching business goals.