This page provides information about Go Code Colorado 2019. Check back in early 2020 for information about the next Go Code Colorado.
GO CODE COLORADO JUDGING CRITERIA
These Go Code Colorado judging criteria are to be used by Go Code Colorado Judges (“Judges”) in choosing winners of the challenge. 2019 marks the first year with two competition tracks; these criteria and guidelines, along with the related technical and data evaluation sheets, detail the requirements for submission.
Semi-Final and Final Competition judging will be based on the material provided in each team’s GitHub repository in the Go Code Colorado GitHub Organization. Scoring will be a weighted composite of the five factors shown in the tables below, detailed scoring from the technical and data evaluations, and material covered during team presentations.
Business Decision Makers
A Business Decision Maker (BDM) is someone within a business responsible for strategic and/or tactical decisions. The competition is funded by the Colorado Secretary of State’s Business and Licensing Division, and therefore the focus of this competition is on businesses. Non-profit entities can be considered businesses for the purpose of the competition, so long as the solution still tracks toward the BDM of that entity. Creating a solution for a government entity is entirely outside the scope of the competition.
Judging Criteria Category Weights
Judging Criteria Tracks – Side by Side
2019 Challenge Statement
Use public data to deliver solutions to or insights for a business decision-maker.
BUSINESS PRODUCT TRACK – In the Product Track, teams compete to create market-relevant applications or tools – using public data – that provide solutions for business decision makers in Colorado.
BUSINESS ANALYTICS TRACK – In the Analytics Track, teams compete to analyze Colorado’s public data to provide useful insights that benefit business decision makers in Colorado.
The Difference Between the Analytics and Product Tracks
Product Track:
- Allows the market-relevant web/mobile application user (the BDM) to come to their own decision from the data and information as presented in the product.
- The BDM interprets the data story based on the individual needs and use cases guiding their use of the application.
- Product tech score is a composite of (1) user experience, (2) sustainability, (3) functionality, and (4) logical structure.
- Product data score value scale: (1) increasing accessibility, (2) combination, (3) analysis.
- Finalists deliver a 5-minute pitch presentation with 5 minutes for questions at the Final Event.
- Results are presented as code and related documentation for a market-relevant web/mobile application, and must be understandable and rebuildable from the repo.
- Must be automated; any manual work serves only as proof of concept for future functioning additions.

Analytics Track:
- Has concrete findings and concrete insights (data, information, knowledge) that are pertinent to a BDM.
- Results are conveyed through the telling of the data story by the analyst, presented as a question and answer for the BDM.
- Analytics tech score is a composite of (1) accuracy, (2) confidence, and (3) repeatability.
- Analytics data score is a composite of (1) data storytelling, (2) data combinations, and (3) data analysis type.
- Presentation of the data story at the Final Event consists of a 5-minute tabletop widescreen demo, with 5 minutes of questions.
- Results are presented as code and related documentation for digital interactive or static outcomes.
- Can be automated, but does not need to be; can be entirely manual.
1.) INNOVATION/RELEVANCE – 20% OF SCORE
Product – Identify a new and different way to provide BDMs with resources through a tool that helps them solve a real problem.
Analytics – Design a data analysis that seeks to address a problem or answers a question relevant to a real problem for BDMs.
2.) IMPLEMENTATION/DATA STORY – 10% OF SCORE
Product – Design and develop a tool based on the BDM as the primary user. All entries should assist the user in easily navigating the tool to find relevant info pertinent to them. The tool must be functional and publicly accessible by Semi-Final submission, with a clear roadmap of how all proposed components would be completed by the Final Event.
Analytics – Clearly define the issue and identify the specific question being analyzed. All entries should be easily digested, with a clear and simple story that can be communicated effectively. The final analysis must be digital and publicly accessible by Semi-Final submission, with a clear roadmap of how any additional insights would be completed by the Final Event.
3.) QUALITY/TECHNICAL EVALUATION – 20% OF SCORE
Product – Ensure the product is well-built, has useful functionality that will be easily adopted by BDMs, and is based on technology and code structure that are sustainable and scalable. The final output must be a minimum viable product (MVP): a product that is available for use by public users. It should also allow insights to be easily drawn from the experience. Both Semi-Final and Final submissions will be evaluated for technical completeness. Product Track Technical Evaluation.
Analytics – Ensure the analytical method is valid and appropriate to the question being answered, and that the design is innovative, comprehensive, and efficient; that the model has an acceptable margin of error and a repeatable methodology; and that the goals and conclusions of the analysis are clearly defined. The final output must be easily digestible by the target audience and convey the message with minimal to no additional interpretation. Both Semi-Final and Final submissions will be evaluated for technical completeness. Analytics Track Technical Evaluation.
4.) PRESENTATION – 10% OF SCORE
Presentation scores for both the Semi-Final and Final rounds will be based on the materials provided in the team’s GitHub repository as they relate to what is presented. The Judges’ goal is to understand each team’s concept and/or analysis and results.
Semi-Final Presentation scores will be based on a recorded screencast of presentation slides with narrated audio (a “Virtual Presentation”), submitted in accordance with the instructions found in the GoCodeColorado GitHub Knowledge Base Semi-Final Submission Guidelines.
Finalist Presentation scores will be based on the in-person delivery of a presentation or demo and questions at the Final Event. Judges will have been given access to the competition repo prior to the event. GoCodeColorado GitHub Knowledge Base Final Submission Guidelines.
Product – Successfully explain how the idea facilitates the discovery of insights by the BDM using public data. The Finalist Teams in the Product Track deliver a 5-minute pitch presentation with 5 minutes for questions at the Final Event.
Analytics – Successfully explain how the data analysis delivers insights to the BDM using public data. The Finalist Teams in the Analytics Track will present their data story as a 5-minute tabletop widescreen demo with 5 minutes for questions at the Final Event.
5.) USE OF DATA SOURCES – 40% OF SCORE
Product – Add value to public data and cogently deliver it to a BDM to help them with their decision-making process. Teams must provide descriptions of the data used and its sources, and show how they have combined or analyzed the data to create new insights that help a BDM. Product Track Data Evaluation.
Analytics – Add value to the public data and cogently use it as part of an analysis to draw conclusions that answer a question asked by a BDM. Teams must provide descriptions of the data used, data sources, methods for obtaining and preparing the data for analysis and how their methods provide reliable and useful information to a BDM. Analytics Track Data Evaluation.
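To illustrate how the weighted composite described above could combine the five category scores, here is a minimal sketch. The weights (20%, 10%, 20%, 10%, 40%) come from the criteria; the example raw scores and the 0–10 scale are assumptions for illustration only, and the actual scoring sheets may aggregate differently.

```python
# Category weights from the judging criteria (must sum to 1.0).
WEIGHTS = {
    "innovation_relevance": 0.20,
    "implementation_data_story": 0.10,
    "quality_technical": 0.20,
    "presentation": 0.10,
    "use_of_data_sources": 0.40,
}

def composite_score(raw_scores: dict) -> float:
    """Weighted sum of per-category scores (all on the same scale)."""
    return sum(WEIGHTS[cat] * raw_scores[cat] for cat in WEIGHTS)

# Hypothetical team scores on an assumed 0-10 scale.
example = {
    "innovation_relevance": 8.0,
    "implementation_data_story": 7.0,
    "quality_technical": 9.0,
    "presentation": 6.0,
    "use_of_data_sources": 8.5,
}

print(round(composite_score(example), 2))
```

Because Use of Data Sources carries 40% of the weight, a strong data score moves the composite more than any other single category.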