Kavitha Rajan

Osnabrueck

Summary

  • Highly skilled data professional with over 9 years of expertise in designing and optimizing scalable data solutions using cloud platforms (AWS), ETL/ELT pipelines, and advanced SQL.
  • Skilled in data integration and virtualization (Denodo, CData), visualization (Power BI, Tableau), cloud data warehousing (Snowflake), and automation with Python.
  • Experienced with Jenkins for CI/CD pipelines, infrastructure as code (Terraform), and GitOps principles.

Overview

  • 8 years of professional experience
  • 1 Certification

Work History

Data Engineer/Data Scientist

KeepLocal GmbH
01.2024 - 04.2024
  • Developed Power BI dashboards, administered user permissions, and scheduled data refreshes to optimize performance and ensure data accuracy
  • Implemented scalable data pipelines using AWS services (DynamoDB, Glue, S3, Lambda, Redshift) to extract, transform, and load (ETL) structured and unstructured data
  • Automated data transfer processes, ensuring reliability and scalability for large data volumes
  • Created ETL jobs in AWS Glue using PySpark and Python scripts
  • Designed and provisioned cloud infrastructure using Terraform, ensuring efficient deployment and management of AWS resources
  • Used Git for version control and Jira and Confluence for project tracking and documentation, ensuring effective collaboration across teams

Data Engineering Specialist

Accenture
03.2022 - 05.2023
  • Worked extensively with RDBMS, dimensional, and NoSQL data sources to design and implement scalable data integration
  • Served on the Middleware Data Integration team, working with middleware platforms such as CData, Denodo, ACE, and APIC
  • Integrated structured and unstructured data from various sources like AWS S3, SQL, Snowflake, SAP HANA, Oracle, TM1, and IBP using Denodo Platform
  • Created unified Data Virtualization layer in Denodo Platform for data analysis
  • Expertise in working with APIs (REST, SOAP, GraphQL) and API testing using Postman
  • Proficient in working with JSON and XML data files to transform and process data for business insights and operational efficiency
  • Created comprehensive documentation for the development process, including high-level designs, technical specifications, and unit test documents
  • Conducted data quality checks, performance testing, and implemented governance protocols to ensure pipeline reliability, security, and compliance with organizational standards
  • Developed Tableau dashboards for data analysis

DevOps Engineer

Mindtree
08.2021 - 03.2022
  • Leveraged Denodo Platform to create a virtual data layer, integrating data from diverse sources for seamless access and real-time insights
  • Designed and implemented CI/CD pipelines in Jenkins for automating the deployment of Denodo virtual data artifacts across multiple environments (Development, QA, and Production)
  • Automated the export and deployment of Denodo VQL scripts using the Denodo CLI integrated into Jenkins pipelines
  • Integrated pipeline stages for validation, deployment, and testing to ensure production-readiness
  • Enabled cross-team collaboration by streamlining the process of managing virtual databases and views

Senior Software Developer

UST Global
11.2015 - 07.2021
  • As a member of the DevOps team, created design documents and technical code designs, developed on the Denodo Platform, performed unit testing, deployed to QA and Production using Solution Manager, and provided production support
  • Created and managed Denodo views, including Base, Derived, Union, and Join, to support data virtualization
  • Created Unix shell scripts to interact with Oracle databases, automating data extraction, transformation, and loading processes for improved efficiency and reliability
  • Expert in SQL, with extensive experience in writing complex queries, optimizing database performance, and developing PL/SQL procedures, functions, triggers, and views to support business and analytical needs
  • Developed shell scripts on AIX, monitored CPU utilization with AppDynamics, and managed Oracle and Teradata SQL databases
  • Provided expertise in mainframe technologies such as COBOL, JCL, IBM DB2, and Unix/Linux server administration

Education

Master of Science - Psychology

Alagappa University
India
01.2021

Bachelor of Engineering - Electronics and Communication

Cochin University of Science and Technology
India
01.2015

Skills

  • Cloud Platforms: AWS (DynamoDB, Glue, S3, Lambda, Redshift, CodeCommit), Azure DevOps
  • Data Tools: Denodo, Power BI, Tableau, Snowflake
  • Data Management: ETL/ELT, Data Warehousing, Data Lake, RDBMS (Oracle, IBM DB2, Teradata), SAP HANA, TM1, IBP
  • DevOps: CI/CD Pipelines, Jenkins, GitHub Actions
  • API Development: RESTful Web Services, JSON, Postman
  • Version Control & Collaboration: Git, Jira, Confluence, GitHub
  • Other Skills: Unix/Linux Administration, ITIL, Production/Application/Technical Support, Debugging, Testing, Business Process Management

Certification

  • Certified AWS Solution Architect, Simplilearn, 10/01/20, 2190049
  • Certified Data Virtualization Developer (Denodo 7 and 8), Denodo Technologies, 04/01/19

Languages

Fluent, Conversational (B1)
