AllBuffs | Unofficial fan site for the University of Colorado at Boulder Athletics programs


Which Big 12 member is the team you most want to beat once we join?

Who do you most want to dominate?

  • Baylor

  • Iowa State

  • Kansas

  • Kansas State

  • Oklahoma State

  • Texas Christian

  • Texas Tech

  • West Virginia

  • One of the new P5 additions (BYU, Cincy, UH, UCF)

  • Some other school from Pac, Independent or G5 you hope joins


Take that 3.14159!

Pie GIF by Squirrel Monkey
 
I vaguely recall some pedagogical research about overly precise language and jargon reducing the effectiveness of communicating and learning.

This also brings to mind the concept of model parsimony: adding more predictor variables can technically improve how well your model fits a given set of data, but it tends to cost you some out-of-sample prediction accuracy, so you prefer the fewest inputs that give you an adequate output.
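For anyone curious, that trade-off is easy to demo in a few lines of Python (a toy sketch with made-up data, not anything rigorous): fit higher-degree polynomials to noisy linear data, and the in-sample error keeps shrinking while the out-of-sample error typically gets worse.

```python
import numpy as np

rng = np.random.default_rng(0)

# True relationship is linear (slope 2); noise hides that from the fit.
x_train = np.linspace(0.0, 1.0, 20)
y_train = 2.0 * x_train + rng.normal(0.0, 0.3, size=x_train.size)
x_test = np.linspace(0.0, 1.0, 200)
y_test = 2.0 * x_test + rng.normal(0.0, 0.3, size=x_test.size)

def fit_errors(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

for degree in (1, 3, 9):
    train_mse, test_mse = fit_errors(degree)
    print(f"degree {degree}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```

The degree-1 model is the parsimonious one here: the extra coefficients in the higher-degree fits mostly end up chasing noise.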
Thank you for this excellent example of my point.

UBL, I presume.
 
Parsimony sounds kinda sweet and sexy.

Pedagogy sounds dirty and illegal.
 