European University
- Developed backend features and RESTful APIs to extend the university website's functionality and improve its performance.
- Implemented quality assurance measures through comprehensive unit tests for backend code.
- Integrated Elasticsearch for fast full-text search and Redis for caching to improve system performance.
- Collaborated closely with frontend developers to integrate backend functionality seamlessly into the Wagtail CMS.
- Optimized and refactored the existing codebase to improve maintainability and scalability.
- Deployed the application using Docker containers to ensure consistency across environments.
- Received positive feedback for significantly improving website performance and user experience.
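The Redis caching mentioned above is typically implemented with the cache-aside pattern. A minimal sketch, assuming that pattern (the project code is not shown here); the names are illustrative, and a plain dict stands in for the `redis.Redis` client so the sketch runs without a server:

```python
# A minimal sketch of the cache-aside pattern for expensive queries.
# A real deployment would use a redis.Redis client; the dict-backed
# stand-in below keeps the sketch self-contained. Names are illustrative.
import json

def get_cached(cache, key, compute):
    """Return the cached value for `key`, computing and storing it on a miss."""
    raw = cache.get(key)
    if raw is not None:
        return json.loads(raw)
    value = compute()
    cache.set(key, json.dumps(value))  # with Redis, add ex=<ttl> for expiry
    return value

class DictCache:
    """In-memory stand-in exposing the subset of the Redis API used above."""
    def __init__(self):
        self._store = {}

    def get(self, key):
        return self._store.get(key)

    def set(self, key, value):
        self._store[key] = value
```

On a cache hit the expensive computation is skipped entirely, which is where the response-time improvement comes from.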
Insurance Company - Airflow project
- Developed and implemented data migration workflows using Apache Airflow in a microservice architecture.
- Designed and scheduled Directed Acyclic Graphs (DAGs) to automate data transfers between services.
- Integrated with multiple APIs, including Salesforce, for seamless data exchange.
- Collaborated closely with other teams to ensure smooth data flow.
- Employed pytest for thorough testing and validation.
- Utilized Django and Python for backend development, ensuring robust data processing.
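The DAG scheduling described above can be sketched roughly as follows. This is a hypothetical fragment assuming a recent Airflow 2.x; the task ids, schedule, and transfer callables are illustrative placeholders, not the actual project code:

```python
# Illustrative Airflow DAG fragment: scheduled data migration between
# services. All ids, schedules, and callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_from_salesforce(**context):
    ...  # pull records via the Salesforce API

def load_into_service(**context):
    ...  # push transformed records to the target microservice

with DAG(
    dag_id="data_migration",
    start_date=datetime(2023, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_from_salesforce)
    load = PythonOperator(task_id="load", python_callable=load_into_service)
    extract >> load  # extract must finish before load starts
```

The `>>` operator declares the edge of the directed acyclic graph, so Airflow runs the tasks in dependency order and retries them independently.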
Insurance Company
- Developed microservices covering user registration, appointment scheduling with doctors, service orders (e.g., plumbers), and access to insurance offerings.
- Leveraged Python, Django REST Framework, Docker, Kubernetes, and more to create a scalable, responsive, and user-friendly platform.
- Implemented a microservices architecture of modular, loosely coupled services, and constructed robust APIs with Django REST Framework for smooth communication between them.
- Containerized each microservice with Docker, ensuring consistent and efficient deployment across diverse environments.
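Each service boundary above needs request validation of the kind DRF serializers provide. A hedged sketch of that pattern using only the standard library so it runs anywhere (in a real DRF service this would be a `rest_framework.serializers.Serializer`; the field names are illustrative):

```python
# Stdlib stand-in for the per-request validation a DRF serializer performs
# at a service boundary, here for a hypothetical appointment-booking payload.
REQUIRED_FIELDS = {"user_id": int, "doctor_id": int, "slot": str}

def validate_appointment(payload):
    """Return (cleaned_data, errors) for an appointment-booking request."""
    errors = {}
    cleaned = {}
    for field, ftype in REQUIRED_FIELDS.items():
        if field not in payload:
            errors[field] = "This field is required."
        elif not isinstance(payload[field], ftype):
            errors[field] = f"Expected {ftype.__name__}."
        else:
            cleaned[field] = payload[field]
    return cleaned, errors
```

Validating at every boundary is what keeps loosely coupled services robust: a malformed message is rejected with a field-level error instead of propagating downstream.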
Amazon Product Scraper
- Designed and implemented an Amazon product scraper using Python 3, Scrapy, Elasticsearch, Docker, and Stackdriver.
- Extracted product listings and specifications from Amazon efficiently.
- Stored data in Elasticsearch for further analysis and insights.
- Conducted systematic scraping of product information from Amazon.
- Contributed to market insights and competitive analysis through comprehensive data collection.
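The extraction step above boils down to a parse callback that turns a listing page into a structured item. A minimal stand-in sketch (the project used Scrapy selectors; here the same idea is shown with the standard library, and the HTML structure and field names are illustrative):

```python
# Stand-in for a Scrapy spider's parse callback: extract a product's title
# and price from a simplified listing page. Markup below is illustrative.
import re

def parse_product(html):
    """Extract title and price from a simplified product page."""
    title = re.search(r'<span id="productTitle">\s*(.*?)\s*</span>', html, re.S)
    price = re.search(r'<span class="price">\$([0-9.]+)</span>', html)
    return {
        "title": title.group(1) if title else None,
        "price": float(price.group(1)) if price else None,
    }
```

In Scrapy this logic would live in `parse()` and yield items that a pipeline indexes into Elasticsearch.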
Data Extraction and Quality Assurance
- Utilized Scrapy, Scrapy-Splash, Selenium, and Docker to extract content from diverse documents and websites.
- Converted unstructured data into XML files, ensuring compatibility and ease of processing.
- Conducted quality assurance to validate the accuracy and reliability of extracted data.
- Managed the intricate process of scraping laws from all 50 U.S. states.
- Demonstrated expertise in data extraction, transformation, and quality control throughout the project lifecycle.
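The XML conversion step above can be sketched with the standard library. The element and attribute names here are illustrative assumptions; the real schema was project-specific:

```python
# Sketch of the structured-output step: serializing one extracted record
# (e.g. a statute section) to XML. Tag and attribute names are illustrative.
import xml.etree.ElementTree as ET

def record_to_xml(record):
    """Serialize a dict like {'state', 'section', 'text'} to an XML string."""
    law = ET.Element("law", state=record["state"])
    section = ET.SubElement(law, "section", number=record["section"])
    section.text = record["text"]
    return ET.tostring(law, encoding="unicode")
```

Emitting one well-formed XML document per record makes the quality-assurance pass straightforward: each file can be validated independently against the schema.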
Web Scraping and Data Transformation
- Led a project utilizing Python, Scrapy, Scrapy-Splash, and Docker for extracting data from various websites.
- Transformed extracted data into JSON files, ensuring compatibility and ease of processing.
- Maintained reliable scrapers to sustain data extraction operations.
- Focused on automation and regulatory data scraping from multiple countries.
- Demonstrated expertise in web scraping, data transformation, and process automation throughout the project.
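The transform step above (scraped records to JSON) can be sketched as follows. The field names and cleaning rules are illustrative assumptions, not the project's actual schema:

```python
# Sketch of the transform step: normalizing scraped regulatory records and
# serializing them as JSON Lines. Fields and cleaning rules are illustrative.
import json

def normalize(raw):
    """Trim whitespace and normalize the country code on one scraped record."""
    return {
        "country": raw["country"].strip().upper(),
        "title": " ".join(raw["title"].split()),
        "url": raw["url"],
    }

def to_json_lines(records):
    """Serialize normalized records as JSON Lines, one record per line."""
    return "\n".join(json.dumps(normalize(r), ensure_ascii=False) for r in records)
```

One record per line keeps the output streamable, so a failed scrape of one country never corrupts the rest of the file.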