5. Tutorial 2 - The 5Rs in action#
The objective of this tutorial is to examine the code you created in tutorial one and assess how you would “score” it with respect to the 5Rs. In addition, you will share your code with your peers so that they can
Re-run your code and Reproduce your results
Re-use your code in their own code
Your code will then be assessed by your peers with respect to Re-runnability, Re-usability, and Reproducibility.
You might find the following table useful. Copy and adapt it as you see fit.
| | Replicability | Reproducibility | Repeatability | Re-usability | Re-runnability |
|---|---|---|---|---|---|
| Strengths | It worked out of the box with | Smashing super lovely | | | |
| Weaknesses | | | | | |
| Improvement | | | | | |
5.1. Task 1#
Take 5 minutes to assess your own code with respect to the 5Rs. You should make two assessments: one covering your strategies and another covering your competition code.
5.2. Task 2#
A folder has been created using your name on the stor-609 Teams group. Add your strategies to this folder.
You will be grouped with one or more other students. Access their strategies from the stor-609 Teams group.
Now try running their strategies in your competition code and consider the following questions -
How easy was it to use their code in your competition?
What changes did you have to make to their code to get it to work?
How will you check whether you get the same results as they did?
What interface do their strategies provide? That is, what inputs are required and what output is returned? (One possible shape is sketched below.)
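For context, in tutorials of this kind a strategy is often just a function that looks at the current state of the game and decides whether to roll again or hold. The sketch below is a hypothetical example of such an interface; the function name, parameters, and return values are assumptions for illustration, not a description of your peers' code.

```python
# Hypothetical strategy interface; names, parameters, and return values are
# illustrative only and may differ from the interface your peers chose.
def hold_at_20(my_score: int, opponent_score: int, round_score: int) -> str:
    """Decide whether to 'roll' or 'hold' given the current game state."""
    return "hold" if round_score >= 20 else "roll"


print(hold_at_20(my_score=35, opponent_score=50, round_score=22))  # -> 'hold'
```

Comparing the number of arguments, the return type, and whether the game state is passed explicitly is usually the quickest way to see how much glue code you will need to plug their strategies into your competition.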
Now make an assessment of their strategy code with respect to the 5Rs. Record your assessment and upload it to the students folder on the stor-609 Teams group.
5.3. Task 3#
For this task you have to upload your competition code to your folder on the stor-609 Teams group.
Now download the competition code from the students you have been grouped with.
You now need to
Adapt their competition code to run your strategies.
Run the competition code with both your strategies and theirs.
Work out who has the best strategy!
Adapt their competition code to run a competition using a ten-sided die, where the round score is lost when a 4 is rolled and the target score is 300 (a possible parameterisation is sketched below).
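If the die size, the losing roll, and the target score are exposed as parameters in their code, this adaptation is a one-line change; if they are hard-coded, you will need to hunt them down. A minimal sketch of such a parameterisation, with hypothetical names, is shown below.

```python
import random
from dataclasses import dataclass


@dataclass
class GameRules:
    """Hypothetical bundle of the parameters this task asks you to change."""

    die_sides: int = 10      # ten-sided die
    losing_roll: int = 4     # the round score is lost when this value is rolled
    target_score: int = 300  # the first player to reach this total wins


def play_turn(rules: GameRules, rng: random.Random, hold_at: int = 25) -> int:
    """Roll until hold_at is reached; return 0 if the losing roll appears."""
    round_score = 0
    while round_score < hold_at:
        roll = rng.randint(1, rules.die_sides)
        if roll == rules.losing_roll:
            return 0
        round_score += roll
    return round_score


print(play_turn(GameRules(), random.Random(0)))
```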
Now make an assessment of their competition code with respect to the 5Rs. Record your assessment and upload it to the students folder on the stor-609 Teams group.
5.4. Task 4#
You now need to adapt your own competition code to compare strategies for the following game.
Same as Pig, except that a ten-sided die and a four-sided die are rolled. You score the sum of the two values unless they are equal, in which case you lose your round score.
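To make the rules concrete, here is a minimal sketch of a single roll in this variant; the function name and return convention are assumptions about one possible way to structure it, not a prescribed design.

```python
import random


def roll_two_dice(rng: random.Random) -> tuple[int, bool]:
    """Roll a ten-sided and a four-sided die.

    Returns (points, lost_round): the sum of the two values, and True if
    the values match, in which case the round score is lost.
    """
    d10 = rng.randint(1, 10)
    d4 = rng.randint(1, 4)
    if d10 == d4:
        return 0, True
    return d10 + d4, False


print(roll_two_dice(random.Random(0)))
```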
Now consider the following questions -
How much of your code did you have to change to make this possible?
Which parts of the code did these changes affect?
How could you refactor your code so that, if I gave you a new game specification, it would be easier to Re-use your competition code? (One possible direction is sketched after these questions.)
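One possible direction, offered as a suggestion rather than the answer, is to separate the game specification from the competition loop, so that supporting a new game means writing a new specification class rather than editing the loop itself. All of the names below are hypothetical.

```python
import random
from dataclasses import dataclass
from typing import Protocol


class Game(Protocol):
    """Hypothetical interface the competition loop depends on."""

    target_score: int

    def play_round(self, rng: random.Random) -> int:
        """Play one round and return the points banked."""
        ...


@dataclass
class StandardPig:
    """Standard Pig with a fixed hold-at-20 strategy, as a concrete example."""

    target_score: int = 100
    hold_at: int = 20

    def play_round(self, rng: random.Random) -> int:
        round_score = 0
        while round_score < self.hold_at:
            roll = rng.randint(1, 6)
            if roll == 1:  # rolling a 1 wipes out the round score
                return 0
            round_score += roll
        return round_score


def rounds_to_win(game: Game, seed: int = 0) -> int:
    """Count how many rounds a single player needs to reach the target score."""
    rng = random.Random(seed)
    total = rounds = 0
    while total < game.target_score:
        total += game.play_round(rng)
        rounds += 1
    return rounds


print(rounds_to_win(StandardPig()))
```

With a split like this, the Task 3 and Task 4 variants each become a new game specification, and the competition loop, the part you want to Re-use, never changes.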
5.5. Optional extra task#
Navigate to the Journal of Statistical Software website. Do a search for articles with "python" in their title.
Most of the articles in the Journal of Statistical Software provide replication code which can be used to Replicate the main results of the article, i.e. figures, tables, and so on. Select one or more of these articles and see if you can Re-run the Replication code. Keep a record of any additional steps you had to take to get the Replication code to work that are not documented in the article or supporting materials. How old is the publication?
If you prefer, you can do this exercise for R-based articles or articles in any other programming language you are familiar with.
If you try doing this for more than one article, pick an old one (more than five years old) and a relatively new one.
Good luck!
5.6. Resources#
5.6.1. Roadmap#
| Quality | Description | Methods | Documentation |
|---|---|---|---|
| Reusable | code can be easily adapted to variations in problem specification | design patterns, packages / libraries | in-code documentation / API documentation |
| Re-runnable | someone else can run the code | packages / libraries / repositories | installation instructions and scripts |
| Repeatable | same results over time | unit tests / maintenance / version control | use cases |
| Reproducible | same results for a given problem | unit tests | case studies / articles / reports |
| Replicable | algorithm / solution can be recoded by someone else | pseudo code | articles / books / reports |
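The table lists unit tests as one method for achieving Repeatability and Reproducibility. Purely as an illustration, a test in the style below pins down the behaviour of a seeded simulation, so anyone re-running the code can confirm they get the same numbers; the function under test is a hypothetical stand-in for your own code.

```python
import random


def simulate_round(seed: int) -> int:
    """Hypothetical seeded simulation standing in for your own round logic."""
    rng = random.Random(seed)
    return sum(rng.randint(1, 6) for _ in range(3))


def test_simulate_round_is_reproducible():
    """Running the simulation twice with the same seed must give the same result."""
    assert simulate_round(seed=42) == simulate_round(seed=42)
```

A test like this can be run with pytest and kept under version control alongside the rest of your competition code.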
5.6.2. Best Practice#

5.6.3. Further Reading#
Best Practices for Scientific Computing
Developing Scientific Software