Dashboards. Reports. Macros. Apps.
My favourite projects, including report, application, and visualization development. Most projects will be available on GitHub soon; I'm working on it!
A sample of what I do with Excel to extract, transform and load data and deliver meaningful takeaways.
Pricing Upload Process Optimization
The ETL process for uploading pricing into the system was manual, lengthy and error-prone. It used to take 4 hours or more to process dozens of files, and prices were regularly changed during that window.
To address that, I created an Excel template I called EZ-PLU (Price Look-Up). It extracts the data from the buyers' Excel files, combines it with additional information entered on the template, and produces a .csv file that is uploaded to the AS/400 system with the new promotional pricing.
Situation
The ETL process to upload pricing into the system was manual, lengthy and had a high rate of inaccuracies. Processing dozens of files took 4 hours or more, and prices were regularly changed during that time.
Task
I took the initiative to automate the process with macros that perform all of the manual steps, saving processing time and giving the buying team the opportunity to make last-minute changes without delaying delivery of the CSV files to IT.
Action
The process was a monotonous loop of copying, pasting, formatting, manually checking, and exporting at least 12 Excel files. A macro was created to open up to three files in a loop, process them, and export the CSV files to be sent to IT for upload to the AS/400.
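A minimal sketch of that batching loop is shown below, assuming the file paths have already been collected (for example via a File Dialog); the paths, sheet names and the processing step itself are placeholders rather than the production code.

```vba
Sub ExportPricingBatch(filePaths As Collection)
    Dim i As Long, limit As Long
    Dim wb As Workbook

    ' Process up to three buyer files per run
    limit = filePaths.Count
    If limit > 3 Then limit = 3

    Application.DisplayAlerts = False
    For i = 1 To limit
        Set wb = Workbooks.Open(filePaths(i))

        ' ... copy, clean and validate the pricing data here ...

        ' Export the processed sheet as a CSV for the AS/400 upload
        wb.Sheets(1).SaveAs Filename:="C:\PLU\upload_" & i & ".csv", _
                            FileFormat:=xlCSV
        wb.Close SaveChanges:=False
    Next i
    Application.DisplayAlerts = True
End Sub
```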
A master sheet would reconcile the imported/exported data in order to find any discrepancies before sending the data to IT for upload.
One of the challenges to address was the buying team's tendency to change the structure of the PTS (Plan to Sell) files, which would cause the macro to copy and paste data into the wrong places. The solution was to search for the header values and set the target columns and rows based on that search, as sketched below.
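The sketch below shows the idea, assuming hypothetical header labels such as "SKU" and "Promo Price"; the real PTS headers and sheet layout differ.

```vba
Sub LocatePricingColumns()
    Dim ws As Worksheet
    Dim headerCell As Range
    Dim headerRow As Long, skuCol As Long, priceCol As Long

    Set ws = ActiveWorkbook.Sheets(1)

    ' Find the header cells instead of assuming fixed addresses,
    ' so the macro survives buyers rearranging the PTS layout.
    Set headerCell = ws.Cells.Find(What:="SKU", LookAt:=xlWhole)
    If headerCell Is Nothing Then
        MsgBox "SKU header not found - check the PTS file layout."
        Exit Sub
    End If
    headerRow = headerCell.Row
    skuCol = headerCell.Column

    Set headerCell = ws.Cells.Find(What:="Promo Price", LookAt:=xlWhole)
    If Not headerCell Is Nothing Then priceCol = headerCell.Column

    ' Subsequent copy/paste steps would use headerRow, skuCol and priceCol
    ' rather than hard-coded cell references.
End Sub
```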
Results
Processing time dropped from several hours to less than 3 minutes, last-minute changes were no longer a concern, and the accuracy of the flyer and signing data uploads increased significantly.
Later, when a new IBM pricing system was proposed and its processing time was determined to be as lengthy as it had been before this project, management pushed back, pointing out that the macro should be kept or that another, less manual and more efficient process should be implemented.
Technical Data
File Dialogs to get file paths.
If statements to check for pricing discrepancies.
Index+Match to look up pricing in any direction (a combined sketch of these three techniques follows this list).
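The sketch below combines the three techniques, assuming placeholder sheet names, column positions and a hypothetical SKU; it is illustrative rather than the production macro.

```vba
Sub ProcessPricingFile()
    Dim fd As FileDialog
    Dim ws As Worksheet
    Dim newPrice As Variant, systemPrice As Variant

    ' File Dialog to get the buyer's file path
    Set fd = Application.FileDialog(msoFileDialogFilePicker)
    fd.Title = "Select a PTS file"
    If fd.Show = 0 Then Exit Sub                 ' user cancelled
    Workbooks.Open fd.SelectedItems(1)
    Set ws = ActiveWorkbook.Sheets(1)

    ' Index + Match to look up a price regardless of where the row sits
    newPrice = Application.WorksheetFunction.Index(ws.Columns(5), _
        Application.WorksheetFunction.Match("1234567", ws.Columns(1), 0))

    ' If statement to flag a pricing discrepancy against the master sheet
    systemPrice = ThisWorkbook.Sheets("Master").Range("B2").Value
    If newPrice <> systemPrice Then
        MsgBox "Pricing discrepancy found for SKU 1234567"
    End If
End Sub
```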
I started developing dashboards with Tableau during my Sears days around 2014, and with Power BI in 2020, just before COVID-19 started.
Power BI Personal Protective Equipment Trackers
In 2021 I was hired by the Government of Ontario through a recruiting agency to develop and maintain Power BI dashboards used to track COVID-19 PPE (Personal Protective Equipment) orders for specific Ontario Ministries that required these types of products, such as schools, courts and offices. I performed Power Query and DAX maintenance and customized the dashboards to fit specific client needs. Additionally, I provided recommendations on process improvements and data accuracy through macros that detected and corrected data discrepancies among data sources. I also assisted with SharePoint management.
Situation and Task
Ontario Government Ministries needed to visualize how effectively suppliers of Personal Protective Equipment (PPE) were performing. Since most ministries had different requirements for their visualizations, each dashboard needed to be customized to accommodate those specific requirements.
In addition, the data came from very different sources and was inconsistent, with some of it containing typos.
Action
After a series of meetings with the stakeholders, lists of specific KPIs were created and the dashboards were updated accordingly to meet each Ministry's needs by editing SQL queries and DAX formulas.
VBA macros were created to detect and fix data inaccuracies; the code was constantly updated as more discrepancies were found during data reconciliation (a simplified sketch follows). The master data file was hosted in a SharePoint data repository for easy access by dashboards and stakeholders.
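A simplified sketch of that kind of clean-up macro is shown below; the sheet name, column and correction rules are placeholders, not the actual supplier data.

```vba
Sub FixSupplierNames()
    Dim ws As Worksheet
    Dim lastRow As Long, r As Long
    Dim supplier As String

    Set ws = ThisWorkbook.Sheets("Orders")
    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row

    For r = 2 To lastRow
        ' Normalize whitespace before comparing
        supplier = Trim(ws.Cells(r, 3).Value)

        ' Correct known typos found during data reconciliation
        Select Case UCase(supplier)
            Case "ACME SUPLIES", "ACME SUPPLIES INC"
                ws.Cells(r, 3).Value = "Acme Supplies"
        End Select
    Next r
End Sub
```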
Results
When a supplier's poor performance was detected, Ministries were able to address delivery issues by contacting the supplier in a timely manner with very precise information. Users were able to track individual orders and resolve the situation for a specific shipment with great accuracy.
Visual Studio projects (C#), AutoHotkey apps to emulate keyboard and mouse tasks, among other things.
AR Tool Add-In Restoration in Visual Studio
Situation and Task
An existing Auto Replenishment (AR) Tool developed in Visual Studio was rendered unusable after its custom data table was decommissioned. The tool was an Excel add-in that downloaded the data into a spreadsheet; the planner updated the On Order numbers and then uploaded the data back to the source, the AS/400 system.
The table was created via a SQL query run automatically on a daily basis.
Action
After finding the source code and Visual Studio project in the archives, I reverse-engineered the query that had created the data source table by using the last results I could find, created a new SQL query from those results, and then recreated the data source table internally in Visual Studio by inserting the additional SQL query into the project.
Results
The add-in was restored, and the planner was once again able to download the data into Excel, update the numbers and upload them using the AR tool.