MongoDB and Web Developer Resume
Charlotte, NC
SUMMARY:
- Software developer with 9 years of experience across banking/financial, capital markets, credit risk, investment banking, marketing, AML, and data warehousing domains.
- Delivered projects under both Agile and Waterfall models.
- Refactored client-side web application code using Python, REST interfaces, and MVC frameworks.
- Developed both procedural and object-oriented Perl/CGI scripts for ETL processes.
- Expert knowledge of MongoDB NoSQL data modeling, tuning, disaster recovery, and backup.
- Served as POC for initial MongoDB clusters, Hadoop clusters, and various Teradata servers; successfully tested and onboarded them and performed basic admin and DBA tasks.
- Extensive server-side coding using object-oriented PHP for in-house banking applications on the Windows platform.
- Worked on full-stack LAMP development using Linux, Apache, Teradata/MongoDB, and Python/Perl.
- Implemented various design patterns such as Singleton, MVC, and DAO.
- Managed configuration and distributed version control using SVN and Git.
- Developed various data models and backup and recovery solutions using Oracle RMAN, Data Pump, and MySQL import/export.
- Real-time experience in data analytics, reporting, data lineage, disaster recovery, archiving, and retention management of multiple application databases at a time on Teradata and Hadoop.
- Worked with incident and change management tools such as Jira, Nexus, HP Quality Center, BMC Remedy, and Maximo.
- Expert knowledge of the AutoSys scheduler and basic knowledge of mainframe CA-7.
- Designed and developed various analytical SQL queries and PL/SQL programs.
- Experienced with ETL tools, with strong knowledge of Teradata Data Mover and Teradata ETL utilities such as BTEQ, ARCMAIN, and TPT.
- 3+ years of experience in HTML, XML, JSON, MongoDB, and PHP development.
- Working experience with Cloudera Hadoop; knowledge of HDFS, YARN, and MapReduce architecture; managed basic admin functions.
- Good understanding of and basic development with big data technologies such as Pig, Hive, HBase, and Sqoop on UNIX, using the DMExpress tool and automating tactical solutions with Python/Perl.
- Wrote lightweight Pig Latin statements, moved data from Teradata to Hadoop using Sqoop, and wrote small-to-medium Hive queries on a large data warehouse team.
- Automated various data transformations between flat files, CSV, TSV, XML, and Excel and database tables and JSON, and vice versa (a sketch of one such conversion follows this list).
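A minimal sketch of the flat-file-to-JSON conversion described above; the file names and fields are illustrative, not taken from any production system:

    import csv
    import json

    def csv_to_json(csv_path, json_path):
        # Read delimited rows as dictionaries and emit one JSON document per record.
        with open(csv_path, newline="") as src:
            records = list(csv.DictReader(src))
        with open(json_path, "w") as dst:
            json.dump(records, dst, indent=2)

    csv_to_json("trades.csv", "trades.json")  # hypothetical file names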
TECHNICAL SKILLS:
- 2+ years of experience with MongoDB clusters, JSON, XML, Informatica, Python, and JavaScript.
- 8+ years of advanced experience in Perl/CGI, Linux/UNIX bash scripting, and the AutoSys scheduler.
- 5+ years of expert experience in Teradata, MySQL, PHP, XML, FTP, NDM, and IBM mainframe.
- 3+ years of experience in Oracle, PL/SQL, Cloudera Hadoop, Pig, Hive, Sqoop, and Syncsort DMExpress.
- Experience installing and monitoring nodes on Linux; SVN (Subversion), Git, and PhpStorm.
- Worked with change management tools such as Maximo, BMC Remedy, Jira, and HP Quality Center.
PROFESSIONAL EXPERIENCE:
Confidential, Charlotte, NC
MongoDB and Web Developer
Responsibilities:
- Migrated ~17 TB of data, mainly from Teradata with some portion from Oracle, to MongoDB.
- Modeled and denormalized various RDBMS tables into the MongoDB JSON document model (see the sketch after this list).
- Configured MongoDB clusters on Red Hat; complete understanding of MongoDB CRUD, indexing, replication, and sharding techniques, plus past experience with DBA functions.
- Developed reusable application code using object-oriented Perl scripting and the Teradata utilities BTEQ, ARC, and TPT on the Linux platform to pull Teradata data into MongoDB JSON and other file formats.
- Developed validation and sanity-check code in Perl for loading accurate data into MongoDB.
- Defined technical standards and high-level and low-level designs for MongoDB clusters.
- Refactored Python code to verify reconciliation, enforce throttle limits, and generate heat-map reports.
- Identified proper delta-extract criteria and indexes and reduced the amount of data sent across the network.
- Analyzed and refactored existing network-based PHP code to use MongoDB in all web apps; parsed heterogeneous data files such as XML, CSV, and JSON into MongoDB and vice versa.
- Worked on MVC and RESTful frameworks on Zend servers; enhanced the report generation process for applications, improving performance by more than 40% after migrating to MongoDB.
- Developed value-adds in JavaScript, Python, and mongo shell scripting to reduce time spent on repeated administration tasks such as user creation, table creation, and user access management (a small example follows this list), and used JavaScript and dynamic HTML for rendering processed GUI data on the client side.
- Mentored and trained various team members and clients on MongoDB and tuning techniques.
- Basic to advanced knowledge of Oracle Data Pump, RMAN, RAC, and other operations.
- Configured PHP MongoDB drivers, PyMongo drivers, AutoSys, Perl, NDM, and FTP for the initial setup.
- Developed and tuned various analytical queries for users for initial validation and testing.
- Worked on integrating with Hadoop data for hot and cold data storage and archival.
- Worked on Pig, Hive, and Sqoop and developed lightweight jobs for interacting with HDFS data.
- Initially worked on POCs for Cassandra and HBase before choosing MongoDB as the preferred solution.
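A minimal sketch of the denormalization pattern referenced above, using PyMongo: two normalized RDBMS rows are embedded into a single MongoDB document. The URI, database, and field names are hypothetical:

    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")  # placeholder URI
    db = client["bankdb"]                              # hypothetical database

    # Embed a customer's account rows inside one document instead of
    # joining two normalized RDBMS tables at read time.
    customer_row = {"cust_id": 101, "name": "ACME Corp"}            # sample data
    account_rows = [{"acct_id": 9001, "balance": 2500.00},
                    {"acct_id": 9002, "balance": 130.75}]

    doc = dict(customer_row)
    doc["accounts"] = account_rows  # denormalized embedded array
    db.customers.replace_one({"cust_id": doc["cust_id"]}, doc, upsert=True)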
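And a sketch of the scripted admin automation mentioned above: user creation driven from Python rather than typed into the shell. The user name, password, and roles are placeholders:

    from pymongo import MongoClient

    # Scripted user creation via the "createUser" database command.
    admin_db = MongoClient("mongodb://localhost:27017")["admin"]
    admin_db.command("createUser", "report_user",
                     pwd="change_me",  # placeholder credential
                     roles=[{"role": "read", "db": "bankdb"}])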
Environment: MongoVUE, MongoDB Compass, MMS, Robomongo, mtools, Mongoose, mongosqld, Vagrant, Oracle VirtualBox, Remote Desktop Connection, Apache Benchmark, Siege, Xdebug, Toad, Teradata SQL Assistant, phpMyAdmin, AutoSys iXp, Notepad++, PhpStorm, Zend, Oracle Data Pump, RMAN, RAC, Maximo, Jira, Nexus, Remedy.
Confidential, Charlotte, NC
Data Warehouse and Production Delivery
Responsibilities:
- Migrated a huge amount of operational data (~500 TB) from one Teradata platform to DR, analytical, and archival Teradata platforms within the data warehousing domain.
- Developed various wrappers and analytical queries using Perl, UNIX bash scripting, Python, Informatica, and the Teradata native utilities BTEQ, ARC, TPT, FastExport, FastLoad, and MultiLoad.
- Developed various wrapper scripts on top of the Teradata Data Mover utility for automatic creation of databases and tables, space forecasting, and grants/revokes on target Teradata platforms.
- Developed reconciliation and throttle-limit scripts in Python to allow and limit concurrency (a sketch follows this list).
- Analyzed thousands of analytical Teradata tables, reindexing and remodeling them in the DW domain to identify and move the proper delta partitions to DR, analytical, and archival Teradata platforms.
- Migrated hundreds of legacy mainframe tape jobs into Teradata on UNIX.
- Developed, reviewed, and deployed thousands of production jobs using XML, AutoSys, mainframe CA-7, and Endevor, and supported the same.
- Developed automated heat-map reports, a mail-generation process for app teams, capacity management reports, and privacy and metadata reporting using PHP, Perl/CGI, networking, Oracle, and MySQL.
- Developed PHP- and MySQL-based web applications for data lineage, DR, and the data-move pipeline.
- Full understanding of Teradata BAR, DSA, NPARC, and QueryGrid operations.
- Worked on LDAP, Active Directory, and Kerberos migrations with full knowledge of each.
- Developed, tested, and deployed Pig Latin scripts and Hive tables on Hadoop for analyzing and storing daily automated web application log data.
- Developed UNIX shell scripts and used Apache Sqoop to move Teradata data to Hadoop for cold storage as part of tactical solutions (see the second sketch after this list).
- Basic knowledge of HBase; tested lightweight jobs to handle metadata information.
- Integrated some Oracle data into Hadoop and vice versa, and worked on defining technical standards for a huge Hadoop ecosystem.
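A minimal sketch of the reconciliation and throttling idea referenced above: per-table row counts on source and target are compared under a concurrency ceiling. The connection factories are assumed to return DB-API connections (e.g. from a Teradata driver); the table list and throttle value are hypothetical:

    from concurrent.futures import ThreadPoolExecutor

    THROTTLE = 4  # hypothetical concurrency ceiling

    def row_count(make_conn, table):
        conn = make_conn()  # factory returning a DB-API connection
        try:
            cur = conn.cursor()
            cur.execute("SELECT COUNT(*) FROM " + table)
            return cur.fetchone()[0]
        finally:
            conn.close()

    def reconcile(make_src, make_tgt, tables):
        # Throttled fan-out: at most THROTTLE tables are counted at once.
        with ThreadPoolExecutor(max_workers=THROTTLE) as pool:
            src = dict(zip(tables, pool.map(lambda t: row_count(make_src, t), tables)))
            tgt = dict(zip(tables, pool.map(lambda t: row_count(make_tgt, t), tables)))
        # Report tables whose source and target counts disagree.
        return [(t, src[t], tgt[t]) for t in tables if src[t] != tgt[t]]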
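And a sketch of the Teradata-to-HDFS cold-storage move, driven from Python; the host, database, table, and paths are placeholders, and the Teradata JDBC connector is assumed to be on Sqoop's classpath:

    import subprocess

    def sqoop_to_hdfs(table, target_dir):
        # Shell out to Sqoop to copy one table into HDFS.
        subprocess.run([
            "sqoop", "import",
            "--connect", "jdbc:teradata://tdhost/DATABASE=dw",  # placeholder URL
            "--table", table,
            "--target-dir", target_dir,
            "--num-mappers", "4",
        ], check=True)

    sqoop_to_hdfs("SALES_HIST", "/data/cold/sales_hist")  # illustrative names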
Environment: Teradata SQL Assistant, Viewpoint, AutoSys iXp, Endevor, CA-7, Informatica, Hadoop, DMExpress, Hue, Apache Benchmark, Siege, PhpStorm, phpMyAdmin, MySQL Workbench, Toad, Nexus, Jira, Remedy, Maximo.
Confidential
Production Support Analyst
Responsibilities:
- Identified code issues in Java, Perl, and PHP applications and provided fixes to development teams.
- Analyzed Oracle and MySQL data as part of resolving user tickets and proposed proper indexing mechanisms to dev teams; logged the progress of fixes in Jira and HP Quality Center.
- Performed and automated Oracle and MySQL backups using RMAN, Data Pump, and MySQL import and export tools (a backup-automation sketch follows this list); monitored Oracle performance using the SQLTXPLAIN tool.
- Contributed to change control management for deploying code fixes using Maximo.
- Closed a number of user defects and provided accurate solutions to users within SLA using Jira.
- Developed automated Excel macros and graphs for tracking daily user-defect status.
- Migrated a small legacy application that generates reports using Python and MySQL into Perl.
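A minimal sketch of the MySQL side of the backup automation mentioned above, shelling out to mysqldump; the database name and directory are hypothetical, and credentials are assumed to come from a protected option file:

    import datetime
    import pathlib
    import subprocess

    def backup_mysql(database, backup_dir="/var/backups/mysql"):
        # Timestamped logical dump of one database.
        stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
        out = pathlib.Path(backup_dir) / f"{database}_{stamp}.sql"
        with open(out, "w") as dump:
            subprocess.run(
                ["mysqldump", "--single-transaction", database],
                stdout=dump, check=True)
        return out

    backup_mysql("appdb")  # hypothetical database name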
Confidential
L1/L2 Production Support
Responsibilities:
- Monitored hundreds of 24/7 AutoSys production jobs for 3 systems that use Perl scripts; ensured exposure was calculated for the feeds arriving from different SORs and that collateral was aggregated for the various trades and displayed in the web-based system.
- Refactored internal GUI applications and backend code to display credit risk calculations for different source feeds using HTML, CSS, network PHP, MySQL, Sybase, and Oracle, and supported the production jobs using AutoSys.
- Developed a number of algorithms for MTM (mark-to-market) and credit risk calculations in Perl and Oracle (a simplified illustration follows this list).
- Applied manual tuning and performance methods to MySQL tables and prepared complex views.
- Supported and escalated L1/L2 production issues on time and maintained SLAs.
- Contributed several value-adds such as a stats script, contingency setup and monitoring, and a NaN check; reduced manual activity, increased effectiveness, and fixed all bugs in user reports.
- Documented commonly recurring defects in an operations handbook that served as the defect reference for various production job failures.
- Contributed software engineering expertise to the development of products through the software lifecycle, from requirements definition through successful deployment; worked L1/L2 on-call support on a 24/7 rotational basis.
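A simplified, purely illustrative view of the exposure aggregation described above; this is not the production logic, and all counterparties, trades, and figures are made up:

    from collections import defaultdict

    # Hypothetical trade feed rows: (counterparty, mark-to-market value).
    trades = [("CP1", 1_200_000.0), ("CP1", -300_000.0), ("CP2", 450_000.0)]
    collateral = {"CP1": 500_000.0, "CP2": 100_000.0}  # posted collateral

    exposure = defaultdict(float)
    for cp, mtm in trades:
        exposure[cp] += mtm  # aggregate MTM per counterparty

    # Net exposure after collateral, floored at zero.
    net = {cp: max(exposure[cp] - collateral.get(cp, 0.0), 0.0) for cp in exposure}
    print(net)  # {'CP1': 400000.0, 'CP2': 350000.0}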