Comments

niplav

Back then I didn't try to get the hostel to sign the metaphorical assurance contract with me; maybe that'd work. A good dominant assurance contract website might work as well.

I guess if you go camping together then conferences are pretty scalable, and if I were to organize another event I'd probably try to first message a few people to get a minimal number of attendees together. After all, the spectrum between an extended party and a festival/conference is fluid.

niplav

> Trying to organize a festival probably isn't risky. It doesn't seem like it'd involve too much time or money.

I don't think that's true. I've co-organized one weekend-long retreat in a small hostel for ~50 people, and the cost was ~$5k. The co-organizers & I probably spent ~50h in total on organizing the event, as volunteers.

niplav

> That's unfortunate that you are less likely to come, and I'm glad to get the feedback. I could primarily reply with reasons why I think it was the right call (e.g. helpful for getting the event off the ground, helpful for pinpointing the sort of ideas+writing the event is celebrating, I think it's prosocial for me to be open about info like this generally, etc) but I don't think that engages with the fact that it left you personally less likely to come. I still overall think if the event sounds like a good time to you (e.g. interesting conversations with people you'd like to talk to and/or exciting activities) and it's worth the cost to you then I hope you come :-)

Maybe to clarify my comment: I was merely describing my (non-endorsed[1]) observed emotional content wrt the festival, and my intention with the comment was not to wag my finger at you guys in the manner of "you didn't invite me".

I wonder whether other people have a similar emotional reaction.

I appreciate Lightcone being open with the information around free invitations, though! I think I'd have bought a ticket anyway if I had time around that weekend, and I'd probably have a blast if I attended.

Btw: What's the chance of a 2nd LessOnline?


  1. I think my reaction is super bound up in icky status-grabbing/status-desiring/inner-ring-infiltrating parts of my psyche which I'm not happy with. ↩︎

niplav

Oops, you're correct about the typo and also about how this doesn't restrict belief change to Brownian motion. Fixing the typo.

niplav

  1. Putting the festival at the same time as EAG London is unfortunate.
  2. Giving out "over 100 free tickets" induces (in me) a reaction of "If I'm not invited I'm not going to buy a ticket". This is perhaps because I hope/wish to slide into those 100 slots, even though it's unrealistic. I believe other events solve this by just publishing a list of confirmed attendees, and staying silent about which of them received free tickets.

niplav

Because[1] for a Bayesian reasoner, there is conservation of expected evidence.

Although I've seen it mentioned that, technically, the beliefs of a Bayesian should follow a martingale, and Brownian motion is a martingale.
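
Here's a minimal sketch of what conservation of expected evidence looks like concretely, assuming a toy coin-flip setup; the biases, prior, and function names are just my own illustration, not anything from the surrounding discussion:

```python
# Minimal sketch (toy setup): a Bayesian's credence is a martingale
# under its own predictive distribution.
# Assumed setup: a coin has bias theta in {0.3, 0.7}; we track the
# reasoner's credence p that theta = 0.7.

def update(p, heads):
    """Posterior P(theta = 0.7) after observing one flip."""
    like_hi = 0.7 if heads else 0.3  # P(flip | theta = 0.7)
    like_lo = 0.3 if heads else 0.7  # P(flip | theta = 0.3)
    return p * like_hi / (p * like_hi + (1 - p) * like_lo)

p = 0.6  # current credence that theta = 0.7

# The reasoner's own probability of seeing heads on the next flip:
p_heads = p * 0.7 + (1 - p) * 0.3

# Conservation of expected evidence: the expected posterior equals p.
expected_posterior = p_heads * update(p, True) + (1 - p_heads) * update(p, False)
print(expected_posterior)  # 0.6, i.e. no expected net update
```

The cancellation isn't specific to these numbers; it's just the law of total probability applied to the posterior, which is why the sequence of credences is a martingale.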


  1. I'm not super technically strong on this particular part of the math. Intuitively it could be that in a bounded reasoner which can only evaluate programs in some restricted complexity class, any pattern in its beliefs that can be described by an algorithm in that class is detected, and the predicted future belief from that pattern is incorporated into current beliefs. On the other hand, any pattern that can only be described by an algorithm outside that class can't be in the agent's class of hypotheses, including hypotheses about its own beliefs, so such patterns persist. ↩︎

niplav

Thank you a lot for this. I think this or @Thomas Kwa's comment would make an excellent original-sequences-style post: it doesn't need to be long, but just going through an example and talking about the assumptions would be really valuable for applied rationality.

After all, it's about how much one should expect one's beliefs to vary, which is pretty important.

niplav

Thank you a lot! Strong upvoted.

I was wondering a while ago whether Bayesianism says anything about how much my probabilities are "allowed" to oscillate around—I was noticing that my probability of doom was often moving by 5% in the span of 1-3 weeks, though I guess this was mainly due to logical uncertainty and not empirical uncertainty.

Since there are ten 5% steps between 50% and either 0 or 1, the expected number of such updates before resolution is about 10 × 10 = 100; over ~10 years that's 10 times a year, or a little less than once a month, right? So I'm currently updating "too much".
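
To sanity-check that arithmetic, here's a rough simulation under the simplifying assumption that my credence moves as an unbiased ±5% random walk until it resolves to 0 or 1 (the function and parameter names are just illustrative):

```python
import random

# Toy model: treat the credence as an unbiased random walk in 5% steps,
# starting at 50% and stopping when it hits 0% or 100%. Work in integer
# steps (0..20) to avoid floating-point issues.

def updates_until_resolution(start=10, top=20):
    """Number of +/-1 steps a symmetric walk takes to hit 0 or `top`."""
    position, n_updates = start, 0
    while 0 < position < top:
        position += 1 if random.random() < 0.5 else -1
        n_updates += 1
    return n_updates

samples = [updates_until_resolution() for _ in range(20_000)]
print(sum(samples) / len(samples))  # averages near 100
```

The average comes out near 100, matching the 10 × 10 back-of-the-envelope above (the gambler's-ruin expectation k·(N−k) with k = 10 and N = 20).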
