Sr. Python Developer/Data Engineer Resume
Louisville, KY
SUMMARY
- Around 7 years of experience in the analysis, design, development, management and implementation of various stand-alone and client-server enterprise applications using Python.
- Strong experience working with different Python editors such as PyCharm, PyScripter, PyStudio, Sublime Text, Emacs and Spyder.
- Experienced in Agile methodologies, Scrum stories and sprints in a Python-based environment, along with data analytics, data wrangling and Excel data extracts.
- Good experience in the different phases of software testing and the Software Development Life Cycle (SDLC), including Agile and Waterfall methodologies.
- Implemented test automation using Selenium WebDriver and Python (see the Selenium sketch after this list).
- Strong experience in automating web application testing using Selenium WebDriver with a testing framework.
- Experience with Python and the Django, Zope and Pyramid frameworks.
- Expertise in locating web elements using XPath and CSS selectors.
- Experienced in scripting languages such as Python, Scala, PHP, JavaScript, Bash and shell scripting.
- Worked with Python testing frameworks such as zope.testing, pytest and Robot Framework.
- Built data pipelines using PySpark, Sqoop, AWS Glue, AWS Lambda, shell scripts, Docker, Kubernetes and MongoDB.
- Strong knowledge of developing web applications and implementing Model-View-Controller (MVC) architecture using the Django web framework.
- Developed business CRUD operations exposed as RESTful services.
- Experience in developing and testing ETL data processes.
- Wrote Python scripts to parse XML and JSON data and load it into a database (see the parsing sketch after this list).
- Experience in developing web-based applications using Python, Django, Flask, REST, JSON, CSS, HTML5 and JavaScript, with good experience using ORMs.
- Hands-on with the Jinja2 and Django template engines.
- Worked on various applications using Python-integrated IDEs such as Visual Studio and PyCharm.
- Expertise in automating validations using Java, Python and JavaScript.
- Utilized AWS services such as S3, Glue, Lambda, SNS, CloudWatch, EC2, DynamoDB, SQS, Batch and CodeCommit for various tasks.
- Experience working on a data lake team across layers such as landing, raw and cleansed.
- Expertise in Object-Oriented design and coding. Good knowledge of various Design Patterns and UML.
- Experienced in unit testing and in Selenium-based testing.
- Good experience with Selenium IDE and creating scripts in Selenium RC using Python.
- Good experience in shell scripting, SQL Server, UNIX/Linux and OpenStack.
- Hands-on experience with Amazon Web Services (AWS) and Amazon cloud technologies such as Amazon EC2 (virtual servers) and Amazon CloudWatch (monitoring).
- Good analytical and problem-solving skills, with the ability to work independently as well as being a valuable, contributing team player.
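A minimal sketch of the Selenium WebDriver automation referenced above, assuming a recent Selenium 4 install with a Chrome driver available; the URL and locators are hypothetical placeholders, not details from an actual project:

```python
# Hypothetical Selenium WebDriver sketch: the URL and locators are
# placeholders for illustration only.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # assumes a Chrome driver is available
try:
    driver.get("https://example.com/login")
    # Locate elements via XPath and CSS selectors
    driver.find_element(By.XPATH, "//input[@name='username']").send_keys("demo")
    driver.find_element(By.CSS_SELECTOR, "input[name='password']").send_keys("secret")
    driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()
    assert "Dashboard" in driver.title  # simple validation step
finally:
    driver.quit()
```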
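A hedged sketch of the XML/JSON parsing and database loading mentioned above, using only the standard library with SQLite standing in for the target database; the file names, tags and table schema are assumptions:

```python
# Illustrative parse-and-load script; users.json / users.xml and the
# users table are hypothetical stand-ins for real sources and targets.
import json
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect("records.db")
conn.execute("CREATE TABLE IF NOT EXISTS users (name TEXT, email TEXT)")

# JSON: assumed to be a list of {"name": ..., "email": ...} objects
with open("users.json") as f:
    for rec in json.load(f):
        conn.execute("INSERT INTO users VALUES (?, ?)", (rec["name"], rec["email"]))

# XML: assumed to contain <user><name>...</name><email>...</email></user> nodes
for node in ET.parse("users.xml").getroot().findall("user"):
    conn.execute("INSERT INTO users VALUES (?, ?)",
                 (node.findtext("name"), node.findtext("email")))

conn.commit()
conn.close()
```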
TECHNICAL SKILLS
Frameworks: Django, Flask, Zope, PyJamas, Struts and CSS Bootstrap
Web Technologies: HTML, CSS, JavaScript, jQuery, AJAX, XML, AngularJS, NodeJS
Programming Languages: Python, Java, C/C++, Perl, SQL and PL/SQL.
Scripting Languages: CSS, AJAX, JavaScript, jQuery, PHP, PowerShell
Versioning Tools: SVN, CVS, Git, GitHub, Bamboo, Bitbucket
Databases: Oracle (9i, 10g, 11g), MySQL, MongoDB
IDEs/Development Tools: NetBeans, Eclipse, PyCharm, PyScripter, PyStudio and Sublime Text.
Operating Systems: Windows, Red Hat Linux 4.x/5.x/6.x, Mac OS X.
Protocols: TCP/IP, HTTP/HTTPS, SOAP, SMTP
Testing Tools: Selenium, Bugzilla and JIRA.
PROFESSIONAL EXPERIENCE
Confidential, Louisville, KY
Sr. Python Developer/Data Engineer
Responsibilities:
- Designed and built data pipelines to serve various business use cases, such as ingesting into the AWS S3 data lake and exporting data to consumers like ServiceNow, Azure, Salesforce and Oracle Cloud.
- Built data pipelines using PySpark, Sqoop, Spark, Scala, SQL, AWS Glue and AWS Lambda, and loaded data into Snowflake.
- Performed data exchange between systems in various formats such as JSON, CSV and Parquet.
- Designed AWS Lambda functions in Python to trigger shell scripts that ingest data into MongoDB and export data from MongoDB to consumers.
- Designed collections, performed CRUD operations and used aggregation pipelines in MongoDB (see the PyMongo sketch below).
- Implemented performance optimizations such as indexes and shard keys for efficiency.
- Utilized AWS services such as EC2, Lambda, Glue, DynamoDB, S3, KMS, CloudWatch, SNS and SQS for various data engineering tasks.
- Designed data engineering pipelines in Python and AWS Lambda with capabilities including data ingestion, error handling, log capture, scheduling, checks and balances, event triggers, extract generation and posting messages to MQ.
- Experience with Angular and other client-side JavaScript frameworks, as well as PHP.
- Worked on ETL testing and data migration, and tracked data quality checks.
- Created pipelines to load data into Snowflake using external stages to support downstream Power BI dashboards.
- Experience with wrapper scripts using UNIX shell, Spark, Scala, Sqoop, Spark SQL, HiveQL and Python.
- Experience deploying via Docker and Kubernetes.
- Experience developing complex applications using Java, Spring Boot, React, MongoDB and the AWS cloud.
- Worked with business users on User Acceptance Testing, mentoring them on various aspects of API testing using Python.
- Built a pipeline to post messages from a local environment to on-premises IBM MQ using Python with pymqi (see the pymqi sketch below).
- Developed visual reports, dashboards and KPI scorecards using Power BI Desktop.
- Created AWS Glue job scripts in Python using PySpark and the AWS Glue libraries.
- Scanned DynamoDB tables for attributes, updated items, audited record counts and created DynamoDB tables (see the DynamoDB sketch below).
- Proactively analyzed output files as part of production rollout.
- Deployed code to different environments via CodeCommit and performed thorough testing in each environment.
- Implemented row-level security on data, with an understanding of application security layer models in Power BI.
- Developed a fully automated continuous integration system using Git, Jenkins and custom tools developed in Python.
- Experience troubleshooting all types of ETL issues; good experience with Docker and Kubernetes.
- Experience with Angular, CSS, JavaScript, TypeScript, PHP and HTML development.
- Experience with components in the Hadoop ecosystem (Hive, Impala, Sqoop, ZooKeeper, Mahout).
- Performed unit testing, system integration testing and regression testing.
- Extensively used JIRA for project planning, such as creating stories and tasks and logging activities on a day-to-day basis.
- Worked successfully in a team environment using Agile Scrum methodologies.
Environment: Python, Shell Script, MongoDB, AWS, Linux, Angular, SQL, JIRA, Agile, Bitbucket, CodeCommit, Git Bash, Power BI, API, Java, PHP, React, PySpark, AWS Lambda, Sqoop, Docker, Kubernetes, SNS, S3, EC2, CloudWatch, SQS, DynamoDB, EMR, ETL, HTML, CSS, JavaScript, Snowflake, Kafka, Azure, CI/CD, Jenkins.
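A minimal PyMongo sketch of the collection CRUD and aggregation-pipeline work described above; the connection URI, database, collection and field names are illustrative assumptions:

```python
# PyMongo CRUD + aggregation sketch; the URI, database and fields are
# hypothetical, not taken from the actual project.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
orders = client["salesdb"]["orders"]

orders.insert_one({"customer": "acme", "amount": 120.0, "status": "open"})  # create
doc = orders.find_one({"customer": "acme"})                                 # read
orders.update_one({"_id": doc["_id"]}, {"$set": {"status": "closed"}})      # update
orders.delete_many({"status": "cancelled"})                                 # delete

# Aggregation pipeline: total amount per customer, highest first
for row in orders.aggregate([
    {"$group": {"_id": "$customer", "total": {"$sum": "$amount"}}},
    {"$sort": {"total": -1}},
]):
    print(row)
```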
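A hedged sketch of posting a message to IBM MQ with pymqi, as in the pipeline above; the queue manager, channel, connection string and queue name are placeholders:

```python
# pymqi sketch for putting a message on an IBM MQ queue; all connection
# details below are placeholders, not real endpoints.
import pymqi

queue_manager = "QM1"
channel = "DEV.APP.SVRCONN"
conn_info = "mq.example.com(1414)"

qmgr = pymqi.connect(queue_manager, channel, conn_info)
try:
    queue = pymqi.Queue(qmgr, "DEV.QUEUE.1")
    queue.put(b'{"event": "extract_complete"}')  # message body as bytes
    queue.close()
finally:
    qmgr.disconnect()
```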
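A brief boto3 sketch of the DynamoDB scanning, item updates and record-count auditing described above; the table, key and attribute names are hypothetical:

```python
# boto3 DynamoDB sketch; pipeline_audit, run_id and status are
# illustrative names, not from the actual tables.
import boto3
from boto3.dynamodb.conditions import Attr

table = boto3.resource("dynamodb").Table("pipeline_audit")

# Scan for items whose status attribute is FAILED
failed = table.scan(FilterExpression=Attr("status").eq("FAILED"))["Items"]

# Update a single item's status
table.update_item(
    Key={"run_id": "2021-06-01-001"},
    UpdateExpression="SET #s = :new",
    ExpressionAttributeNames={"#s": "status"},  # status is a reserved word
    ExpressionAttributeValues={":new": "RETRIED"},
)

# Audit the record count (DynamoDB refreshes this roughly every six hours)
print("item count:", table.item_count)
```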
Confidential, Columbia, SC
Sr. Python Developer
Responsibilities:
- Built data pipelines using PySpark, Sqoop, Spark SQL and AWS services, and loaded data into S3 buckets for consumers.
- Wrote Python scripts to integrate with different systems using RESTful services.
- Worked with different teams to gather requirements and built logic to meet them.
- Created AWS Glue job scripts in Python using PySpark and the AWS Glue libraries (see the Glue sketch below).
- Designed data engineering pipelines in Python and AWS Lambda with capabilities including data ingestion, error handling, log capture, scheduling, checks and balances, event triggers, extract generation and posting messages to MQ.
- Used Python to place data into JSON files for testing Django websites.
- Experience working with Power BI and Snowflake in DirectQuery mode.
- Utilized AWS services such as EC2, Lambda, Glue, DynamoDB, S3, KMS, CloudWatch, SNS and SQS, along with Docker and Kubernetes, for various data engineering tasks.
- Updated and manipulated content and files using Python scripts.
- Used Django configuration to manage URLs and application parameters.
- Built Django forms to record data from online users (see the Django form sketch below).
- Developed tools using Python and XML to automate menial tasks.
- Experience with J2EE technologies, including Spring Boot and core Java packages.
- Experience developing complex applications using Angular.
- Deployed code to different environments using AWS CodeCommit.
- Created a framework to capture Power BI user tracking/logging, usage monitoring and anomaly detection in published datasets.
- Used JIRA for project planning, such as creating stories and tasks and logging activities on a day-to-day basis.
- Knowledge of ETL tools for executing data load jobs and analyzing job failures and log issues.
- Developed a fully automated continuous integration system using Git, Jenkins and custom tools developed in Python.
- Performed unit testing, system integration testing and regression testing.
- Worked on developing SQL queries and stored procedures on MySQL Server.
Environment: PySpark, Spark SQL, AWS, Azure, Python, Java, JSON, Angular, ETL, Sqoop, Power BI, Django, Docker, Kubernetes, XML, JIRA, MySQL Server.
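A skeleton of an AWS Glue job in Python/PySpark of the kind described above; the catalog database, table, column mappings and S3 path are hypothetical placeholders:

```python
# AWS Glue job skeleton; raw_db, orders and the S3 path are illustrative.
# Runs inside the Glue environment, which provides the awsglue libraries.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a source table from the Glue Data Catalog
src = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)

# Rename/retype columns on the way through
mapped = ApplyMapping.apply(
    frame=src,
    mappings=[("order_id", "string", "order_id", "string"),
              ("amt", "double", "amount", "double")],
)

# Write the cleansed output to S3 as Parquet
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://my-datalake/cleansed/orders/"},
    format="parquet",
)
job.commit()
```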
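A minimal sketch of a Django form for recording online-user data, as mentioned above; the form, fields and the persistence helper are illustrative assumptions:

```python
# Hypothetical Django form; field names are placeholders.
from django import forms

class UserRecordForm(forms.Form):
    name = forms.CharField(max_length=100)
    email = forms.EmailField()
    comments = forms.CharField(widget=forms.Textarea, required=False)

# In a view, the form validates POST data and exposes cleaned values:
#     form = UserRecordForm(request.POST)
#     if form.is_valid():
#         save_record(form.cleaned_data)  # save_record is hypothetical
```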