Full-Stack Python Developer with a strong focus on microservices. Python expert with recent exposure to PySpark, Kafka, and MQ. Built highly scalable ad-tech and social-media platforms.
Designed scalable, automated infrastructure; active contributor to open-source projects. DevOps expert with recent work on Terraform, Docker, and related tooling.
Also used AWS and in-house products. Effective as an individual contributor.
Expertise in Python, Django, design patterns, libraries (NumPy, SciPy, Matplotlib, Pandas, SQLAlchemy), Amazon Web Services (AWS), NoSQL (MongoDB, Cassandra) and relational databases (MySQL, SQL Server, Oracle, PostgreSQL), and Big Data/Hadoop technologies, working across the full Software Development Life Cycle (SDLC).
Experienced in developing web services in Python, implementing JSON-based RESTful and XML-based SOAP web services. Hands-on experience with SCM tools such as Git and containers such as Docker; deployed projects through Jenkins using the Git version control system.
Experienced in Big Data integration and analytics based on Hadoop, PySpark, and NoSQL databases such as HBase and MongoDB. Skilled in developing microservices as RESTful web services using Akka Actors and the Akka HTTP framework in Scala, handling high concurrency and high traffic volumes.
Hands-on experience with Python's JSON modules for calling web services; handled multiprocessing by creating Celery tasks backed by the RabbitMQ message broker. Built queuing architectures on RabbitMQ for scalability and performance.
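Celery over RabbitMQ implements a producer/worker queue. As an illustration of that pattern only (not Celery's actual API), here is a stdlib sketch using `queue.Queue` and threads; the JSON task framing and the "double the payload" handler are hypothetical stand-ins for broker messages and task code:

```python
import json
import queue
import threading

def make_task(name, payload):
    """Serialize a task as JSON, the way a broker message would be framed."""
    return json.dumps({"task": name, "payload": payload})

def worker(task_queue, results):
    """Consume serialized tasks until a None sentinel arrives."""
    while True:
        msg = task_queue.get()
        if msg is None:
            break
        task = json.loads(msg)
        # Hypothetical handler: double the payload value.
        results.append(task["payload"] * 2)
        task_queue.task_done()

task_queue = queue.Queue()
results = []

threads = [threading.Thread(target=worker, args=(task_queue, results))
           for _ in range(2)]
for t in threads:
    t.start()

for i in range(5):
    task_queue.put(make_task("double", i))
for _ in threads:
    task_queue.put(None)  # one sentinel per worker
for t in threads:
    t.join()
```

In a real Celery deployment the queue lives in RabbitMQ and workers run as separate processes, but the producer/sentinel/worker flow is the same.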
Experienced with Docker, creating images deployed on AWS as microservices, and with managing local deployments in Kubernetes: creating local clusters and deploying application containers. Good experience with Jenkins and the container orchestration tool Kubernetes.
Extensive experience in web application development using Python with the Django framework and web technologies, applying object-oriented programming concepts to create scalable, robust components reusable across the whole framework.
Experienced in creating single-page applications using React.js and working with the React Flux architecture. Good experience in the Amazon Web Services (AWS) environment and good knowledge of AWS services such as Lambda, DynamoDB, S3, RDS, SQS, SNS, CloudWatch, CloudFormation, Elastic Compute Cloud (EC2), Kinesis, and Aurora.
Experience with JSON-based REST web services and Amazon Web Services (AWS); responsible for setting up a Python REST API framework using Django.
Expertise in implementing service-oriented architecture using SOAP and RESTful web services. Experience implementing data-warehouse solutions in AWS Redshift; worked on various projects to migrate data from on-premises databases to AWS Redshift, RDS, and S3.
Hands-on experience using Big Data services such as Pig, Hive, and Kafka for MapReduce workloads, allowing large volumes of data to be analyzed easily. Good knowledge and experience working with relational/SQL (MySQL, PostgreSQL, MS SQL, Oracle) and non-relational (MongoDB, Couchbase) databases.
Experience working in a Software Development Life Cycle (SDLC) environment using Agile methodologies. Able to develop algorithms for descriptive and predictive analysis. Adept at providing analytical support to key business applications and solutions.
FULL STACK PYTHON DEVELOPER/DATA ENGINEER
Confidential - NEW YORK, NY
Developed a highly flexible system for listings and handled end-to-end development, from building APIs in Django and the front end in React to deploying various features. Built a Python/Django app for sending and receiving web notifications and a Python client for various web APIs, performing tasks such as authentication.
Developed an application by writing server-side code with Node.js and the Express framework, storing and retrieving data with MongoDB, and designing front-end web pages with React.js. Built reusable, customizable components for the new website using React.js and React Router to create a single-page web application.
Implemented Django models to build all database mapping classes, and used Python scripts to update content in the database and manipulate files. Created a Python script to monitor server load in the production environment and horizontally scale the servers by deploying new instances.
Successfully migrated the Django database from SQLite to MySQL to PostgreSQL with complete data integrity, and designed, developed, and deployed CSV parsing using a Big Data approach on AWS EC2.
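Moving a Django project between database backends is usually a matter of dumping data on the old backend, swapping the `DATABASES` setting, and reloading. A hypothetical settings fragment for the final PostgreSQL step (database name, host, and credentials are placeholders, not values from the original project):

```python
# Hypothetical Django DATABASES setting after the SQLite -> MySQL -> PostgreSQL move.
# Typical flow: `manage.py dumpdata > data.json` on the old backend, switch this
# dict, run `manage.py migrate`, then `manage.py loaddata data.json`.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "listings",        # placeholder database name
        "USER": "app",             # placeholder credentials
        "PASSWORD": "change-me",
        "HOST": "db.internal",     # hypothetical host
        "PORT": "5432",
    }
}
```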
Developed a fully automated continuous-integration system using Git, MySQL, Jenkins, and custom tools developed in Python. Managed datasets using Pandas data frames and MySQL; queried the MySQL database from Python using the MySQLdb connector package to retrieve information.
Used Spark Streaming APIs to perform on-the-fly transformations and actions for building a common learner data model that ingests data from Kafka in near real time and persists it to Cassandra. Used Kubernetes to orchestrate the deployment, scaling, and management of Docker containers.
Used Jenkins pipelines to drive all microservice builds out to the Docker registry and then deployed them to Kubernetes; created and managed Pods with Kubernetes. Worked with lxml to dynamically generate SOAP requests based on the services. Developed a custom hash-key (HMAC) based algorithm in Python for web service authentication.
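An HMAC request-signing scheme typically canonicalizes the request, signs it with a shared secret, and verifies with a constant-time comparison. A minimal stdlib sketch, assuming a hypothetical canonical format (method, path, body joined by newlines) and a placeholder secret:

```python
import hashlib
import hmac

SECRET_KEY = b"example-shared-secret"  # placeholder; real keys come from secure config

def sign_request(method: str, path: str, body: str) -> str:
    """Build the canonical string and return its hex HMAC-SHA256 digest."""
    canonical = "\n".join([method.upper(), path, body])
    return hmac.new(SECRET_KEY, canonical.encode("utf-8"), hashlib.sha256).hexdigest()

def verify_request(method: str, path: str, body: str, signature: str) -> bool:
    """Recompute the signature and compare in constant time to resist timing attacks."""
    expected = sign_request(method, path, body)
    return hmac.compare_digest(expected, signature)

sig = sign_request("POST", "/api/v1/orders", '{"id": 42}')
ok = verify_request("POST", "/api/v1/orders", '{"id": 42}', sig)
```

`hmac.compare_digest` rather than `==` is the important detail: it prevents an attacker from recovering the signature byte-by-byte via response timing.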
Used Docker to implement a high-level API providing lightweight containers that run processes in isolation; created customized Docker container images, then tagged and pushed them to the Docker repository. Developed RESTful APIs using Python Flask and SQLAlchemy data models, and ensured code quality by writing unit tests with Pytest.
Worked on Linux servers and created scripts for data modeling and data import/export; worked with Python ecosystem packages such as NumPy, Pandas, Matplotlib, and IPython/Jupyter notebooks. Developed Python mapper and reducer scripts and implemented them using Hadoop streaming.
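In Hadoop streaming, the mapper emits key/value pairs on stdout and the reducer receives them sorted by key. A minimal sketch of that contract using the classic word count (generators stand in for the stdin/stdout plumbing a real streaming job would use):

```python
from itertools import groupby

def mapper(lines):
    """Map phase: emit a (word, 1) pair for every token in the input lines."""
    for line in lines:
        for word in line.strip().split():
            yield word, 1

def reducer(pairs):
    """Reduce phase: sum counts per word. Hadoop streaming guarantees that
    pairs arrive sorted by key, which sorted() emulates here."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

counts = dict(reducer(mapper(["big data big", "data pipeline"])))
```

In an actual job, each function would read `sys.stdin` and print tab-separated pairs, and Hadoop would handle the shuffle/sort between the two phases.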
Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts. Developed Python APIs to dump the array structures in the processor at the failure point for debugging; good knowledge of Spark platform parameters such as memory, cores, and executors.
Designed AWS Lambda functions in Python to trigger the shell script that ingests data into MongoDB and to export data from MongoDB to consumers. Developed data-transition programs from DynamoDB to AWS Redshift (ETL process) using AWS Lambda, creating Python functions for specific events based on use cases.
Wrote all the microservices in Python, utilizing distributed message passing via the Kafka message broker with JSON as the data exchange format. Responsible for automating the deployment of the Conductor application on AWS Lambda using high-end AWS architectural components.
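When JSON is the exchange format between services, each Kafka message is usually wrapped in a small envelope carrying an id, topic, and timestamp alongside the payload. A stdlib sketch of that framing (the field names and topic are hypothetical; a real service would pass the bytes to a Kafka producer rather than handle them locally):

```python
import json
import uuid
from datetime import datetime, timezone

def build_envelope(topic: str, payload: dict) -> bytes:
    """Wrap a payload in a JSON envelope as it would be published to a Kafka topic."""
    message = {
        "id": str(uuid.uuid4()),                             # unique message id
        "topic": topic,
        "produced_at": datetime.now(timezone.utc).isoformat(),
        "payload": payload,
    }
    return json.dumps(message).encode("utf-8")

def parse_envelope(raw: bytes) -> dict:
    """Consumer side: decode bytes back into the envelope dict."""
    return json.loads(raw.decode("utf-8"))

raw = build_envelope("orders.created", {"order_id": 7, "total": 19.99})
msg = parse_envelope(raw)
```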
Analyzed all service endpoints to identify redundant calls to the external business service; suggested caching, indexing columns in RDS, creating a new GSI for DynamoDB, code improvements, and migrating to Python Lambda functions.
Deployed Airflow (Celery Executor) on EC2 instances mounted to EFS as a central directory, with SQS as the broker, metadata stored in RDS, and logs in S3 buckets. Streamed data in real time using Spark with SQS. Responsible for handling streaming data from web server console logs.
Triggered Lambda/Python from S3 through SNS to process the payload, make an external call to the vendor API, and capture the response in an S3 log bucket. Enabled SES and SNS notification services to alert users about faulty runs, delayed runs, and extracted data details.
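In the S3 → SNS → Lambda chain, SNS delivers the original S3 notification as a JSON string inside each record's `Sns.Message` field, so the handler must unwrap two layers. A sketch of a handler for that shape; the bucket/key names in the sample event and the returned structure are hypothetical:

```python
import json

def lambda_handler(event, context):
    """Unwrap an SNS-wrapped S3 event and collect bucket/key pairs for processing."""
    objects = []
    for record in event.get("Records", []):
        # SNS wraps the original S3 notification as a JSON string in Message.
        s3_event = json.loads(record["Sns"]["Message"])
        for s3_record in s3_event.get("Records", []):
            objects.append({
                "bucket": s3_record["s3"]["bucket"]["name"],
                "key": s3_record["s3"]["object"]["key"],
            })
    return {"statusCode": 200, "objects": objects}

# Hypothetical test event mimicking the SNS -> Lambda payload shape.
sample_event = {
    "Records": [{
        "Sns": {
            "Message": json.dumps({
                "Records": [{
                    "s3": {
                        "bucket": {"name": "payload-bucket"},
                        "object": {"key": "in/run1.json"},
                    }
                }]
            })
        }
    }]
}
result = lambda_handler(sample_event, None)
```

Invoking the handler locally with a hand-built event like this is a common way to unit-test Lambda code before deployment.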
Performed automation of the CI/CD pipeline in a private cloud using Jenkins shared libraries and multibranch pipelines, and automated static code analysis through SonarQube in the Jenkins pipeline to check code quality.
Developed custom Ansible playbooks integrated into the Jenkins post-configuration to set up the automated build pipeline for Git repository projects, and developed a CI/CD system with Jenkins on a Kubernetes container environment.
Developed frontend and backend modules using Python on the Django web framework, using Git and GitHub. Exposure to developing a portal to manage entities in a content management system using Flask.
Involved in the analysis, specification, design, implementation, and testing phases of the Software Development Life Cycle (SDLC), using Agile methodology to develop the application.
Wrote prepared statements and called stored procedures using callable statements in MySQL. Developed tools using Python, shell scripting, and XML to automate menial tasks. Worked on migration to AWS (cloud) instances, and on Jenkins for CI/CD in the production environment.
Involved in setting up Kubernetes for clustering and orchestrating Docker containers running microservices by creating Pods. Worked with Docker and Kubernetes on multiple cloud providers, from helping developers build and containerize their applications (CI/CD) to deploying on public or private clouds.
Built Docker images and checked them into AWS ECR for Kubernetes deployment. Used AWS CloudWatch for monitoring, custom metrics, and file logging, and used AWS Lambda to manage servers and run code in AWS.
Involved in development of web services using SOAP for sending and receiving data from the external interface in XML format. Used the Python library BeautifulSoup for web scraping to extract data for building graphs.
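The scraping described above boils down to parsing HTML and pulling out specific attributes. Sketched here with the stdlib `html.parser` rather than BeautifulSoup (so it runs with no dependencies); the sample page and link targets are invented for illustration:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags while the parser walks the document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag being opened.
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

page = '<html><body><a href="/jobs/1">One</a><a href="/jobs/2">Two</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
```

BeautifulSoup would express the same extraction as `[a["href"] for a in soup.find_all("a")]`, at the cost of an extra dependency but with far more forgiving handling of malformed markup.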
Using Django Evolution and manual SQL modifications, modified Django models while retaining all data with the site in production. Worked with JSON-based REST web services; created a Git repository and added the project to GitHub.
Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB, and deployed AWS Lambda code from Amazon S3 buckets. Created a Lambda deployment function and configured it to receive events from the S3 bucket.
Orchestrated multiple ETL jobs using AWS Step Functions and Lambda, and used AWS Glue for loading and preparing data for customer analytics.
Used Apache CouchDB (NoSQL) on an AWS Linux instance in parallel to RDS MySQL to store and analyze job-market information. Involved in migrating databases from on-premises to AWS RDS; migrated the MySQL and MS SQL database servers using AWS Database Migration Service.
Established AWS Direct Connect between the client's data center and the AWS data center location. Developed AWS CloudFormation templates and set up Auto Scaling for EC2 instances. Created DynamoDB tables with auto-scaling enabled and assigned read and write capacity accordingly.
Designed the data models used in data-intensive AWS Lambda applications aimed at complex analysis, creating analytical reports for end-to-end traceability, lineage, and definition of key business elements from Aurora.
Used Amazon SNS to send notifications alerting administrators of problems with the databases. Used Amazon EC2 along with Amazon SQS to upload and retrieve project history, using Amazon SQS to queue up work to run asynchronously on distributed Amazon EC2 nodes.
Wrote Bash and Python scripts integrating Boto3 to supplement automation provided by Ansible and Terraform for tasks such as encrypting EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks. Used AWS CloudWatch to monitor and store logging information. Deployed the project into Amazon Web Services (AWS) using AWS Elastic Beanstalk.
Responsible for delivering datasets from Snowflake to the One Lake data warehouse; built a CI/CD pipeline using Jenkins and AWS Lambda, and imported data from DynamoDB to Redshift in batches using AWS Batch with the TWS scheduler.
Worked with data stored in AWS using Elastic MapReduce and Redshift PostgreSQL. Wrote Ansible playbooks to automate the process of creating master and worker nodes in the Kubernetes environment.
Involved in developing RESTful API services using the Python Flask framework. Worked on REST web services as well as a Node.js REST framework for backend services; used MongoDB (NoSQL) for database services.
Used data structures such as dictionaries and tuples, along with object-oriented, class-based inheritance, for building complex network algorithms. Extensive working experience in an Agile environment using a CI/CD methodology.
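A minimal sketch of how those structures combine in network code, assuming a toy topology invented for illustration: a dictionary maps node names to tuples of neighbors, and class-based inheritance lets a `Router` specialize the base `Node` behavior:

```python
class Node:
    """Base network node keyed by name."""

    def __init__(self, name):
        self.name = name

    def describe(self):
        return f"node:{self.name}"

class Router(Node):
    """Subclass overriding describe() via class-based inheritance."""

    def describe(self):
        return f"router:{self.name}"

# Adjacency map: dictionary of node name -> tuple of neighbor names (immutable).
topology = {
    "r1": ("h1", "h2"),
    "h1": (),
    "h2": (),
}

nodes = {"r1": Router("r1"), "h1": Node("h1"), "h2": Node("h2")}
labels = [nodes[n].describe() for n in sorted(topology)]
```

Tuples rather than lists for the neighbor sets make each adjacency entry hashable and accident-proof, which matters once the topology is shared across algorithms.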
SOFTWARE ENGINEER (PYTHON)
Utilized PyUnit, the Python unit-test framework, for all Python applications; rewrote an existing application as a Python module to deliver data in a specific format, and developed Python batch processors to consume and produce various feeds.
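PyUnit is the stdlib `unittest` module: tests are methods on a `TestCase` subclass, discovered and run by a runner. A sketch against a hypothetical batch-processor helper (`normalize_feed` is invented for illustration):

```python
import unittest

def normalize_feed(records):
    """Hypothetical batch-processor step: drop empty records and lowercase the rest."""
    return [r.strip().lower() for r in records if r.strip()]

class NormalizeFeedTest(unittest.TestCase):
    def test_drops_empty_and_lowercases(self):
        self.assertEqual(normalize_feed([" A ", "", "b"]), ["a", "b"])

    def test_empty_input(self):
        self.assertEqual(normalize_feed([]), [])

# Run the suite programmatically; `python -m unittest` would discover it the same way.
result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeFeedTest)
)
```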
Predominant use of the Python Matplotlib package and Tableau to visualize and graphically analyze data. Performed data pre-processing, splitting the identified data set into training and test sets using other Python libraries.
Developed consumer-facing features and applications using Python and Django with test-driven development. Successfully migrated the Django database from SQL to PostgreSQL with complete data integrity. Worked on changes to OpenStack and AWS to accommodate large-scale data-center deployment.
Worked in the MySQL database on simple queries and wrote stored procedures for normalization. Learned to index and search/query large numbers of documents in Elasticsearch.
Worked with web APIs to make calls to web services using URLs, performing GET, PUT, POST, and DELETE operations on the server. Created stored procedures and triggers on the database to provide and insert specific data from multiple tables for the web API services.
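Those verb-specific calls can be built with the stdlib `urllib.request` alone. The sketch below constructs the request objects without sending them (the URL is a hypothetical example endpoint, not one from the original projects):

```python
import urllib.request

def build_request(method, url, payload=None):
    """Build an HTTP request with an explicit verb, ready for urlopen()."""
    data = payload.encode("utf-8") if payload else None
    req = urllib.request.Request(url, data=data, method=method)
    req.add_header("Content-Type", "application/json")
    return req

# Hypothetical endpoint; urllib.request.urlopen(req) would actually send it.
get_req = build_request("GET", "https://api.example.com/items/1")
put_req = build_request("PUT", "https://api.example.com/items/1", '{"name": "x"}')
```

Passing `method=` explicitly matters because `Request` otherwise infers the verb from whether `data` is present (GET without, POST with), which cannot express PUT or DELETE.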