NISHANT KETU

Digital Specialist Engineer

Work Experience

Digital Specialist Engineer
- Infosys
  • Created data pipelines and executed ETL operations to move large volumes of data from a GCP bucket to an external VPC network.
  • Utilized Dataproc to create and manage pipelines and monitored workflows with Airflow, ensuring reliable data operations and timely execution (see the sketch after this entry).
  • Simulated mail traffic on Pub/Sub using a GCS bucket, creating a dataset that mimics mail traffic patterns and triggers for Screwdriver.
  • Worked on backend development for a data migration platform that transfers user data between collaboration platforms, with a focus on API integration, data mapping, and data migration processes.
  • Developed an e-commerce application incorporating user authentication, product catalog management, and shopping cart operations.
Oct 2022 - Present
Bangalore
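
A minimal sketch of the kind of Airflow-orchestrated GCS transfer described above, assuming a daily copy between buckets; the DAG id, bucket names, and object prefix are illustrative assumptions, not values from the actual project.

```python
# Hypothetical Airflow DAG: copy objects out of a source GCS bucket on a
# daily schedule. Bucket names, prefix, and DAG id are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_gcs import GCSToGCSOperator

with DAG(
    dag_id="gcs_daily_transfer",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    copy_exports = GCSToGCSOperator(
        task_id="copy_exports",
        source_bucket="source-data-bucket",   # assumed source bucket
        source_object="exports/*",            # assumed object prefix
        destination_bucket="landing-bucket",  # assumed destination bucket
        move_object=False,                    # copy rather than move
    )
```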
Data Engineer Intern
- Cognizant
  • Interned as a Data Engineer/Analyst, implementing scalable solutions for complex data problems.
  • Developed data pipelines and solutions to drive data-driven decision-making.
  • Performed flight data analysis, addressing complex data challenges with SQL, PySpark, and Hadoop (see the sketch after this entry).
Feb 2022 - May 2022
Bangalore
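
A minimal sketch of a flight-data aggregation in PySpark, assuming a CSV dataset on HDFS with origin, dest, and dep_delay columns; the path and column names are assumptions rather than the internship's actual schema.

```python
# Hypothetical PySpark job: average departure delay per route.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("flight-data-analysis").getOrCreate()

# Assumed HDFS path and schema: origin, dest, dep_delay
flights = spark.read.csv("hdfs:///data/flights.csv", header=True, inferSchema=True)

avg_delay = (
    flights.groupBy("origin", "dest")
    .agg(F.avg("dep_delay").alias("avg_dep_delay"))
    .orderBy(F.desc("avg_dep_delay"))
)
avg_delay.show(10)  # worst ten routes by average departure delay
```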

Education

Bachelor of Engineering
- JSS Academy Of Technical Education
2018 - 2022

Software Engineering
- Scaler Academy

Projects

Video Processing Service
  • Developed a video processing service enabling users to upload videos, which are then made available in multiple resolutions.
  • Leveraged Cloud Storage for user uploads, publishing Pub/Sub notifications on upload events.
  • Deployed the backend service on Cloud Run, with user authentication managed through Firestore.
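
A minimal sketch of the upload flow, assuming a Pub/Sub push subscription that posts GCS object notifications to the Cloud Run service; the Flask route and the process_video helper are hypothetical names, not the project's actual code.

```python
# Hypothetical Cloud Run endpoint receiving Pub/Sub push messages for new uploads.
import base64
import json

from flask import Flask, request

app = Flask(__name__)

@app.route("/", methods=["POST"])
def handle_upload_event():
    envelope = request.get_json()
    # GCS notifications arrive base64-encoded in message["data"]
    payload = json.loads(base64.b64decode(envelope["message"]["data"]).decode("utf-8"))
    process_video(payload["bucket"], payload["name"])
    return "", 204  # a 2xx response acknowledges the Pub/Sub message

def process_video(bucket: str, name: str) -> None:
    # Placeholder: download, transcode to multiple resolutions, re-upload
    print(f"processing gs://{bucket}/{name}")
```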
Django E-commerce Backend
  • Designed data models and integrated MySQL, managing database tables with Django Migrations.
  • Utilized Django ORM for efficient data retrieval.
  • Implemented authentication, secured APIs, conducted testing, and optimized performance.
  • Leveraged Redis for data caching and Celery for background jobs, resulting in a robust and scalable e-commerce backend.
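
A minimal sketch of the caching and background-job pattern described above, assuming a Redis-backed Django cache and a Celery worker; the Product model, cache key, and task name are illustrative assumptions.

```python
# Hypothetical Django/Celery snippet: cache a catalog query, e-mail in the background.
from celery import shared_task
from django.core.cache import cache  # Redis-backed via the CACHES setting
from django.core.mail import send_mail

from store.models import Product  # assumed app and model


def cached_products():
    products = cache.get("product_list")
    if products is None:
        products = list(Product.objects.values("id", "title", "unit_price"))
        cache.set("product_list", products, timeout=300)  # cache for 5 minutes
    return products


@shared_task
def send_order_confirmation(order_id: int, email: str) -> None:
    send_mail(
        subject=f"Order {order_id} confirmed",
        message="Thanks for your order!",
        from_email="shop@example.com",
        recipient_list=[email],
    )
```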