
Lead Programmer Analyst Resume


King of Prussia, PA

PROGRAMMING SKILLS:

Languages: C, Java, Perl, Ruby, PL/SQL, Matlab, R, SAS

Databases: Cassandra, Oracle 10g, MySQL, SQL Server 2008, DB2, Teradata, Access

Tools: ElasticSearch, Talend, Informatica, Weka, ERwin, TOAD, Git

OS & Hardware: Unix, Linux, DOS, Windows, MVS/ESA, z/OS, IBM System/390 mainframe

WORK EXPERIENCE:

Confidential, King of Prussia, PA

Lead Programmer Analyst

Responsibilities:

  • Design and develop algorithms, prototypes and technical specifications for products and product extensions built on a big data platform
  • Architect and develop solutions, including design review, code development, code review and unit testing, for aggregate spend solutions that handle large healthcare datasets and real-time search using a mix of Oracle, Cassandra and ElasticSearch
  • Design, develop and administer commercial data management (CDM) and master data management (MDM) solutions
  • Develop algorithms and match plans (probabilistic/deterministic) for building client master data, matching and consolidation, using business rules dictated by customer requirements for federal reporting, golden-record creation, etc. (a simplified match-rule sketch follows this list)
  • Analyze and model large structured and unstructured documents and datasets to determine associations and clusters for in-house solutions, sales and marketing
  • Develop solutions and web services that read and post data over REST interfaces for matching, validation and reporting needs
  • Lead the team in design, development, coding, implementation and integration of an in-house aggregate spend solution that consumes spend data from various sources and formats, performs data cleansing and business-rule validation, and includes developing and reviewing scripts for exception rules and metric reports
  • Initiated and led teams and projects to improve coding standards, version control and source code management using Git and GitHub
  • Develop and maintain solutions for data validation, data massaging and data scrubbing using scripting and ETL tools
  • Implement, integrate and automate solutions and add-ons for deployment, integration, technical validation, monitoring, health checks and Logstash analysis
  • Analyze, modify and develop code for various components of solution implementations, propose software changes, upgrades, patches, bug fixes, data quality identification and remediation on the company’s technology stack
  • Analyze system and solution performance, throughput, response time, make recommendations, and implement improvements
  • Develop and implement reporting solutions and data visualization for internal and external clients against large datasets for state and federal reporting needs
  • Lead project teams in requirements gathering, design and development; facilitate creation of work orders, change orders, and high-level and low-level technical and functional documents
  • Actively work with Scrum masters, business analysts, project managers, solution leads and quality analysts to assess requirements, challenges, timelines, lead code reviews for implementation and enhancement projects
  • Business and working knowledge of multiple facets of healthcare data from industry, state and federal sources, spend transactions from sources such as CTE and CRO feeds, and regulations pertaining to Open Payments (the Sunshine Act)
  • Technical Lead for design, implementation and integration of HMS solutions in Life Sciences, Payer, Provider, Government and Global clients in the healthcare sector
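
A minimal sketch of the kind of deterministic/probabilistic match plan mentioned above, assuming hypothetical field names (NPI, name, ZIP), weights and thresholds; the production match plans were driven by customer business rules and are not reproduced here.

```java
import java.util.Locale;

// Simplified illustration of a match plan: a deterministic pass on an exact
// identifier, followed by a probabilistic pass that scores name/ZIP agreement.
// Field names, weights and thresholds are hypothetical.
public class ProviderMatcher {

    public static class ProviderRecord {
        public final String npi;       // national provider identifier (may be null)
        public final String lastName;
        public final String firstName;
        public final String zip;

        public ProviderRecord(String npi, String lastName, String firstName, String zip) {
            this.npi = npi;
            this.lastName = lastName;
            this.firstName = firstName;
            this.zip = zip;
        }
    }

    // Deterministic rule: identical, non-empty NPI values are an automatic match.
    public static boolean deterministicMatch(ProviderRecord a, ProviderRecord b) {
        return a.npi != null && !a.npi.isEmpty() && a.npi.equals(b.npi);
    }

    // Probabilistic rule: weighted score over normalized names and ZIP agreement.
    public static double probabilisticScore(ProviderRecord a, ProviderRecord b) {
        double score = 0.0;
        if (normalize(a.lastName).equals(normalize(b.lastName)))   score += 0.5;
        if (normalize(a.firstName).equals(normalize(b.firstName))) score += 0.3;
        if (a.zip != null && a.zip.equals(b.zip))                  score += 0.2;
        return score;
    }

    private static String normalize(String s) {
        return s == null ? "" : s.trim().toUpperCase(Locale.ROOT).replaceAll("[^A-Z]", "");
    }

    public static void main(String[] args) {
        ProviderRecord a = new ProviderRecord("1234567890", "Smith", "John", "19406");
        ProviderRecord b = new ProviderRecord(null, " smith ", "JOHN", "19406");
        System.out.println("Deterministic match: " + deterministicMatch(a, b)); // false (no NPI on b)
        System.out.println("Probabilistic score: " + probabilisticScore(a, b)); // 1.0 -> golden-record candidate
    }
}
```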

Confidential, Denton, TX

Web Developer and Database Administrator

Responsibilities:

  • Development and maintenance of a web-based interface for capturing scholarly activities of Faculty and Graduate Students of UNT.
  • Responsible as application developer and DBA on MS SQL Server and MySQL databases, requirements elicitation, access control, migration, ad hoc data loading and report generation.
  • Reviewed and implemented appropriate statistics gathering on database objects as needed, such as histograms and data ranges
  • Experience in the design, development and support of relational and data warehouse database designs in a team-oriented environment

Confidential

Software Engineer

Responsibilities:

  • Team Lead for Data Warehousing, Data Mining and Business Intelligence tools
  • Handled projects in extract-transform-load (ETL), performance tuning and query optimization for health insurance clients using Informatica
  • Analyzed and tuned PL/SQL packages, functions, stored procedures and ad hoc SQL using explain plans, SQL trace, optimizer hints, etc.
  • Analyzed and created fact and dimension tables
  • Developed Informatica mappings, reusable transformations, reusable mappings and mapplets for data loads to the data warehouse and database
  • Performed extensive performance tuning by identifying bottlenecks at various points (targets, sources, mappings, sessions or the system), leading to better session performance
  • Proposed and implemented recommended physical object structures and their design, e.g. materialized views, indexes and partitioned tables
  • Responsible for analysis, design, coding, testing and implementation and maintenance of application software on IBM Mainframes for several projects
  • Experience in UNIX and in writing shell scripts for Informatica pre- and post-session operations and database administration activities
  • Designed and developed scripts to read files from a leading medical claims clearing house for use in-house as a source for various analyses
  • Read single-line files containing millions of records, parsing and transforming the data into buckets to determine attending physician, place of service, payer and de-identified patient information (a simplified parsing sketch follows this list)
  • Faculty Profile System - development and maintenance of a web-based interface for capturing scholarly activities of faculty and graduate students of UNT, as required by a Texas House Bill
  • Responsible as DBA; worked on requirements elicitation from department heads and other administrative officers, access control, data migration from MS Access to FPS, ad hoc data loading and report generation
  • Developed the portal using PHP as the front end and used MySQL as the back end.
  • The code also used LDAP access and binding for authorization and validation of users
  • Used SOAP for information exchange for search purposes
  • Created Perl scripts that supported a query interface on FPS, formatting, file transfer via scp, loading of primary identifiers into datasets during data migration from MS Access tables, and report generation in xls/csv formats
  • Created indexes on tables, especially the publication and projects tables, which hold high-volume data and are accessed mostly for viewing and for report generation
  • The project has several components: information retrieval, entity recognition, relationship extraction, hypothesis generation and visualization
  • Created a CGI interface for the C-Engine to pass search terms to public databases; retrieved all matching publications from those databases, stored them on a local server for processing, and displayed the list of publications along with author and journal information on the result page
  • Developed a script to invoke a crawler that visits the above databases, grabs the physical pages and splits each page into text and image files
  • Wrote scripts to identify entities of interest (genes/proteins) in the downloaded files using a set of rules and learning algorithms, and to filter out false labels
  • Created algorithms and scripts to determine relationships between two entities of interest within a single sentence and to hypothesize a set of implicit associations from the explicitly mentioned associations using association rule mining
  • Used SQL Server to create local databases containing protein tables, and created stored procedures that execute programs depending on the query term passed as a parameter
  • Visualization was done using the NodeXL add-in for Microsoft Excel
  • Iris localization used to identify and track the iris region in an image of an eye using the Hough transform; code written in C and run in Matlab
  • Normalization performed using Daugman's rubber sheet model
  • Matlab code written for feature encoding using 1D log-Gabor filters
  • Weka classifiers used to classify the various images and assign labels
  • Hamming distance used to compute the similarity measure (a minimal Hamming-distance sketch follows this list)
  • An interface to submit an image query and display the matched results was developed using VB.NET
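
A simplified sketch of the single-line claims file parsing described above; the record separator, field separator and field positions are assumptions, since the actual clearing-house layout is not shown here.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified sketch of splitting one very long line of claim records into
// buckets. The record separator ('~'), the field separator ('|') and the
// field positions are assumptions; the real clearing-house layout differed.
public class ClaimsBucketizer {

    public static Map<String, Integer> bucketByPlaceOfService(String singleLine) {
        Map<String, Integer> buckets = new HashMap<>();
        for (String record : singleLine.split("~")) {
            String[] fields = record.split("\\|");
            if (fields.length < 4) {
                continue; // skip malformed records
            }
            // Hypothetical layout: attendingPhysician|placeOfService|payer|patientHash
            String placeOfService = fields[1];
            buckets.merge(placeOfService, 1, Integer::sum);
        }
        return buckets;
    }

    public static void main(String[] args) {
        String line = "DR01|OFFICE|PAYER_A|p9f3~DR02|HOSPITAL|PAYER_B|k2d8~DR01|OFFICE|PAYER_A|m4x1";
        System.out.println(bucketByPlaceOfService(line)); // {OFFICE=2, HOSPITAL=1}
    }
}
```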
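
A minimal sketch of the normalized Hamming-distance comparison referenced above, following the common masked iris-code formulation; the original project code was written in C and Matlab, so this Java version is illustrative only.

```java
// Minimal sketch of a normalized Hamming distance between two binary iris
// codes with occlusion masks, in the spirit of the Daugman-style matcher
// described above.
public class IrisHamming {

    // Returns the fraction of usable bits that disagree (0.0 = identical codes).
    public static double normalizedHammingDistance(boolean[] codeA, boolean[] codeB,
                                                   boolean[] maskA, boolean[] maskB) {
        int usable = 0;
        int differing = 0;
        for (int i = 0; i < codeA.length; i++) {
            if (maskA[i] && maskB[i]) {        // only compare bits valid in both codes
                usable++;
                if (codeA[i] != codeB[i]) {
                    differing++;
                }
            }
        }
        return usable == 0 ? 1.0 : (double) differing / usable;
    }

    public static void main(String[] args) {
        boolean[] a    = {true, false, true, true, false, false};
        boolean[] b    = {true, false, false, true, false, true};
        boolean[] mask = {true, true, true, true, true, false}; // last bit occluded
        double hd = normalizedHammingDistance(a, b, mask, mask);
        System.out.println("Normalized Hamming distance: " + hd); // 0.2 -> likely same iris
    }
}
```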
