
    Culture Seeds Ideation Challenge


    Challenge Overview

    Prize Distribution

1st place - $2,000
2nd place - $1,000
3rd place - $500

Checkpoint Prizes: $100 each for up to 5 winners. The checkpoint submission is due Jan 13, 2019, 11:59:59 PM EST.

    Challenge Overview

In this challenge, we are interested in how to determine the number of people in an organization who have a piece of information (e.g., are aware of a concept) and how many of them are adopting that information (e.g., putting the concept into action). We would like to see innovative ideas for quantifying these numbers in a relatively objective and automatic way.

    Task Detail

The task in this challenge is to determine how many people in an organization have a piece of information and how many of those people are adopting the information (i.e., living the values of the information).

In more detail, we want to understand the overall reach of a program that is creating new information after 1 year (and periodically beyond). This should be a meaningful number, and we should be able to explain how we reached it. The ideal solution will not require a lot of manual work, but we are looking for the best approach, so we realize some manual work may be necessary and are open to it.

Things to Consider: A few things that could play into our approach:

• There are likely 2 types of people - those who are aware (have heard the information) and those who are active (those who are living - to some extent - the values of the information). We could consider more types as well; for example, aware could be further broken down into two categories: those who are aware and could actually explain the information to others, and those who are aware but can’t do anything with the information because their awareness is so surface level
    • Another way to look at types of people - passive vs passionate person
    • Type of communication could influence the adoption - written vs verbal communication, intimate vs large group setting when receiving information, etc
• “The information” could come in many forms as well. The information includes a variety of values, and it is not necessary that people adopt all of the values to be counted; demonstrating even a subset of the values should be considered adoption for these purposes
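
The people types described above could be encoded as an ordered set of levels. The sketch below is purely illustrative (the level names and classification signals are our assumptions, not part of the brief); it maps observable signals to the aware/active distinction, treating adoption of even a subset of the values as "active":

```python
from enum import Enum

class Level(Enum):
    UNREACHED = 0         # never encountered the information
    AWARE_SURFACE = 1     # has heard of it, but too superficially to act on it
    AWARE_EXPLAINING = 2  # aware, and could explain the information to others
    ACTIVE = 3            # living at least a subset of the values

def classify(heard: bool, can_explain: bool, values_adopted: int) -> Level:
    """Map hypothetical observable signals to an awareness/adoption level.

    Demonstrating even a subset of the values counts as adoption,
    per the challenge brief."""
    if values_adopted > 0:
        return Level.ACTIVE
    if can_explain:
        return Level.AWARE_EXPLAINING
    if heard:
        return Level.AWARE_SURFACE
    return Level.UNREACHED
```

A passive-vs-passionate dimension or the communication channel could be added as separate attributes alongside the level, since they influence adoption rather than define it.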

Background information on the program: We are part of an internal strategic initiative for a large (170k+ people) global organization. The initiative is focused on shaping behaviors through information, tools, frameworks, etc. Our methodology is to surround the organization with this information - so we have outreach to execs (top down), individual contributors (bottom up), and strategic segments (sides). See the image for a visualization of this.

    Here is some of what we have done to surround the organization.

• We have hosted roundtables with subsets of execs and their direct reports
  • Total execs touched: about 6
  • Total roundtable attendees: about 80
    • There are 2 business units that have been selected to receive the information through direct programming
      • For 1 of those business units, only 177 people of about 20,000 were offered to receive the information through scheduled meetings (note - not everybody attended these meetings)
  • For the other business unit, the entire population of 4,000 was asked to receive the information through scheduled meetings (note - not everybody attended)
    • We have an open network and individuals can elect to join (grassroots portion). Some of these individuals participate with the information every day and play an active role in spreading the information to their networks (which we may or may not touch directly). Some only occasionally participate but understand to an extent what the information is. Others may have little understanding of the information but had good intentions when joining the network.
      • 220 evangelists
• We regularly post videos that share thought leadership and may alert people to the intent of the program, but the detailed information is not included in full
    • We send monthly mailers to the entire company that include snippets of the information
    • We have set-up tables to hand out one page summaries of the information
      • This was done for about 1 hour two different times
  • Assume 500 people took the one-pager - they did not necessarily read it or now live the values, and some may have been encouraged to join in another way (those would be counted in other numbers)
    • We have hosted a handful of live sessions which were highly engaging and shared many pieces of the information
    • Note - we know of a handful of situations where people we have worked with have taken the information and presented the information to another group - of varying sizes - on their own accord. There are very likely situations similar to this that we don’t know about. All to say - there are other ways the information is spreading that does not come directly from us.
• We release information in waves, with a new type of information every month. We would like to measure each concept/piece of information against the others to see what is catching on and what is not.

    Data Description

We have the following data to support these activities (note - a lot of the data is manually maintained right now, and mass data sets are not owned by this team, so we are trying to be strategic about how we produce this number vs. actually calculating it - hence this challenge).

    • Count of 1-1 or small group interactions between our team and individuals (includes above activities + more one-off interactions). We call these “meaningful impressions” and are only those meaningful direct interactions with our team.
    • Count of attendees on large sessions we have held (note that duplicates are difficult to impossible to remove if someone attends multiple sessions - which is likely)
    • Count of people in the open grassroots network
    • Count of people in the open grassroots network who are actively engaged (we have another tool we built where these people can log activities (this list can be shared if meaningful))
    • Followers on our internal intranet page (note we do not actively use or post on this, but it exists so I mention it)
    • Count of people who have installed a tool built by us to reinforce one piece of the information (note - this does not mean everyone is using it and it is possible for a leader to install on behalf of a team). The tool allows people to submit a piece of information so may be possible to count those who take action vs just see the result of the installed tool.
    • Count of people who have opened our monthly mailer, those who read it <7 seconds, those who read it 7-14 seconds, those who read over 14 seconds, those who opened all the links
    • Count of people who have watched a video posted by us (note we cannot tell unique values here either)
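
Several of the counts above (session attendees, video views) cannot be deduplicated. One standard statistical workaround - an assumption on our part, not something the brief prescribes - is a capture-recapture (Lincoln-Petersen) estimate: if two attendance lists can be cross-matched even on a sample, the overlap lets you estimate the size of the unique pool both lists draw from:

```python
def lincoln_petersen(n1: int, n2: int, overlap: int) -> float:
    """Estimate the unique population reached, given two attendance
    counts and the number of people known to appear on both lists.

    Classic capture-recapture estimator: N ≈ n1 * n2 / overlap.
    Assumes the two sessions sample the reachable pool roughly
    independently; overlap must be nonzero."""
    if overlap <= 0:
        raise ValueError("needs at least one person seen in both samples")
    return (n1 * n2) / overlap
```

For example, two sessions with 80 and 60 attendees and 20 known repeat attendees would suggest a reachable pool of about 80 × 60 / 20 = 240 unique people, rather than the naive 140-person sum.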

    Possible Approaches we have thought about (this should not limit your approach it is only informational):

• Survey a random subset of the organization, ask if they have heard about the information, and try to understand how/where/from whom they got it. We could then potentially deduce the degrees of separation the information traveled from the source (note - this could be time intensive, and it may not accurately capture levels of adoption without many survey questions; the longer the survey, the less likely we receive a response, which is a concern anyway)
    • Directly calculate based on the counts above (note we feel this will under-represent the spread and adoption of information)
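
The first approach above can be sketched numerically. This is a minimal illustration under our own assumptions (a simple random sample and a normal-approximation interval; the 170k headcount comes from the brief): estimate the aware fraction from the sample and scale it to the organization, with an uncertainty band that shows how sample size trades off against precision:

```python
import math

def estimate_reach(sample_size: int, num_aware: int,
                   org_size: int = 170_000, z: float = 1.96):
    """Extrapolate awareness from a random survey to the whole org.

    Returns (point_estimate, low, high): the estimated number of
    aware people, with a normal-approximation 95% interval on the
    sample proportion, scaled to org_size."""
    p = num_aware / sample_size
    se = math.sqrt(p * (1 - p) / sample_size)  # standard error of proportion
    return (org_size * p,
            org_size * max(0.0, p - z * se),
            org_size * min(1.0, p + z * se))
```

For instance, if 60 of 400 randomly surveyed employees report having heard the information, the point estimate is 170,000 × 0.15 = 25,500 aware people, with the interval roughly 19,600–31,400 - which makes concrete the brief's concern that a short survey gives reach but not adoption depth.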

    Final Submission Guidelines

    Submission

    Contents

    A document with details for the proposed algorithm and/or a proof of concept solution, pseudo-code or any documentation/ previous research papers that helps illustrate the proposal

The final submission should be a report, more like a technical paper. It should include, but not be limited to, the following contents. The client will judge the feasibility and the quality of your proposed approach.

1. Title: Title of your idea
2. Abstract/Description: High-level overview/statement of your idea
----2.1 Outline of your proposed approach
----2.2 Outline of the approaches that you have considered, with their pros and cons
----2.3 Justification of your final choice
3. Details:
----3.1 Detailed description. You must provide details of each step and of how it should be implemented
--------3.1.1 Description of the entire mechanism
--------3.1.2 The advantage of your idea - why it could be better than others
--------3.1.3 If your idea builds on some theory or known papers, cite them
----3.2 Reason why you chose this approach
----3.3 Details on how it will be used
----3.4 References to the papers behind the theory
----3.5 Reasoning behind the feasibility of your idea
4. Appendix (optional):
----4.1 Bibliography, references to papers, etc.

    Checkpoint Submission

In this challenge, we allow checkpoint submissions. A checkpoint submission should at least include the “Abstract/Description” part.

    Final Submission

    In the final submission, you must submit all items described in Contents section above.

    Format

• The document should be a minimum of 2 pages in PDF or Word format describing your ideas.
• It should be written in English.
• Using charts, diagrams, and tables to explain your ideas is encouraged.

    Judging Criteria

You will be judged on the quality of your ideas, the quality of your description of them, and how much benefit they can provide to the client. The winner will be chosen by the most logical and convincing reasoning as to how and why the presented idea will meet the objective. Note that this contest will be judged subjectively by the client and Topcoder; however, the judging criteria below will largely be the basis for the judgment.

• Innovative Ideas (50%)
  • How innovative is your solution compared to the solutions already considered above?
  • What’s the major advantage of your solution?
  • Why is it more effective in practice?
  • Please try to use existing theories/papers to justify it.
• Feasibility (40%)
  • Is your solution easy to implement in our large organization?
  • Does it involve any additional cost?
• Clearness (10%)
  • The report must be well-written and easy to follow.

    Note: We will be looking mainly at “Abstract / Description” for the checkpoint submissions.

    Submission Guideline

You can submit at most TWO solutions, but we encourage you to include your best solution, with as much detail as possible, in a single submission.

    Reliability Rating and Bonus

    For challenges that have a reliability bonus, the bonus depends on the reliability rating at the moment of registration for that project. A participant with no previous projects is considered to have no reliability rating, and therefore gets no bonus. Reliability bonus does not apply to Digital Run winnings. Since reliability rating is based on the past 15 projects, it can only have 15 discrete values.
