I concur with bobbym, except technically sinθ and tanθ may also be negative (because cosθ=7/25 has two solutions).
I think there are several solutions because there are two solutions to sinθ = 4/5, namely 2pi*n + pi - sin^(-1)(4/5) and 2pi*n + sin^(-1)(4/5). I might guess the one you wanted was 1/2, but I also got the following by substituting every combination:
-1/2, +/-7/4, +/-25/4, +/-25/14, +/-25/3*cos(3-sin^(-1)(4/5)), 25/3*cos(3+sin^(-1)(4/5)), 7/3*cos(3-sin^(-1)(4/5))
Hi! When I graphed it, I found the answer, but couldn't remember how to solve it algebraically.
I don't have a teacher's take on this, but I don't find it too unreasonable in the scheme of bad exam questions. I would have arrived at the same answer bobbym gave, probably because I would be checking specific kinds of rules instead of using my imagination xD
Incidentally, that rule has a minimum of 18.75 when n = 0.5.
Incidentally again, the rule for the Fibonacci sequence is: ((1/2*(1+√5))^n - (1/2*(1-√5))^n) / √5
While probably difficult to find, it is not all that intimidating to look at (: I don't yet know whether this sequence has a similar rule involving √5, though. However, spreadsheets make this process quite simple, if you don't mind approximate answers as the numbers get very long (or spending time manipulating the result to see further decimal places).
I tried the process term(n+1) = term(n) + term(n-1). It says the 300th term will be about 5.8181 * 10^63, or 5.8181 vigintillion. Yikes
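For anyone who wants to check these without a spreadsheet, here is a quick Python sketch comparing the recurrence with the closed-form rule above. The thread's sequence starts from different seed values (not restated here), so the ordinary Fibonacci seeds 0, 1 are used purely as an illustration:

```python
from math import sqrt

def fib_iter(n):
    """n-th Fibonacci number via the recurrence term(n+1) = term(n) + term(n-1)."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def fib_binet(n):
    """The closed form quoted above: ((1/2*(1+sqrt5))^n - (1/2*(1-sqrt5))^n) / sqrt5."""
    phi = (1 + sqrt(5)) / 2
    psi = (1 - sqrt(5)) / 2
    return round((phi**n - psi**n) / sqrt(5))

# The two rules agree (floating point stays exact enough for small n)
assert all(fib_iter(n) == fib_binet(n) for n in range(1, 40))
print(fib_iter(30))   # 832040
```

The iterative version uses exact integers, so it never loses digits the way a spreadsheet does as the terms grow.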
Basically you're right about problem three: Bob forgot the acute part of the definition. I just drew the triangles, since there aren't that many possibilities. Triangles where n is 8 to 12 or 18 to 22 are obtuse. 17 is a right triangle. 13 to 16 work so there are four triangles.
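Just to illustrate the counting: the sides aren't restated in this post, so I'm assuming the two fixed sides are 8 and 15 (consistent with n = 17 giving a right triangle). Then the tally can be checked by enumeration with the law of cosines:

```python
def classify(a, b, c):
    """Classify a triangle by its largest angle, via the law of cosines."""
    x, y, z = sorted((a, b, c))
    if x + y <= z:
        return "not a triangle"
    lhs, rhs = x * x + y * y, z * z
    if lhs > rhs:
        return "acute"
    if lhs == rhs:
        return "right"
    return "obtuse"

# Hypothetical fixed sides 8 and 15, third side n an integer
results = {n: classify(8, 15, n) for n in range(8, 23)}
print([n for n, kind in results.items() if kind == "acute"])   # [13, 14, 15, 16]
```

This reproduces the count above: n = 8 to 12 and 18 to 22 come out obtuse, 17 is right, and exactly four values (13 to 16) give acute triangles.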
Problem four did not seem too difficult. All you have to do is visualise a circular continuum of circles (Yourtown) within a circle (Little Village) around Capital City, and it's clear that if there's only one solution they're in a straight line. I just drew the problem and it was transparent that way. It is neat though.
Guess I will ponder three when I get some sleep xP
Edit: I guess if you saw that 310 + 200 + 150 = 660 that would make it transparent as well, haha. But that is why I need sleep!
Well that's straightforward enough. A player would prefer a chance higher than 2% of getting 50 coins to a 100% chance of 1 coin, since his expectation would then be higher than 1/50 * 50 coins = 1 coin. As for pirate 1, he would need better than a 1 in 25 chance, or 4%, since he will otherwise get 2 coins. But the information he has must lead him to conclude there is such a chance of getting 50 coins, which there won't be if his opponents are totally rational. With real players, though, it's very likely indeed, especially when someone values their life much more than any amount of gold.
I suppose in reality 5 pirates is a bit like poker, so it can be very hard to know for sure how to play; it depends who you are playing with.
If we did an experiment, where you were to be pirate 1, and I filled in the rational offers for pirates 2 to 5 (where the other pirates believe that you are rational too), you would quickly see that turning down the initial 1 coin is pointless; the second pirate's proposal will be accepted and you will get nothing. But it is true that if we have three or four players who will kill out of spite if they aren't offered enough, and they have a reputation for it, they will be more successful; they will be paid more for their votes. But that's only because they pledge to do something stupid if they're offered 1 coin. The rational players would like to pledge that too, but, being too rational to act stupidly, no one will believe them.
So, in an attempt to explain away your problems with the definition of rationality, the idea is this: Sometimes the best thing is for a rational person to be thought ignorant or unpredictable or unreasonably demanding and vengeful. But in the pirate problem, everyone knows that everyone will always act in their own interests in every situation they could possibly find themselves in. So the game is up. It's not that the pirates are doing something wrong by accepting 1 coin; it's simply that they are doing the best they can given what the other pirates believe about them. They are not making mistakes; they are being truly optimal with their rationality; it's just that their reputation is their undoing. When you introduce some element of unpredictability, however, you will quickly see rational players act differently.
We should get 5 of us together and actually play 5 pirates lol. I assure you, I would take your threats very seriously! (Although it kind of gives you an advantage when I tell you that xD.)
I was assuming that the game was played only once in what I said. But rational players must play the same outcome repeatedly for any finite fixed number of games, which is shown by working backwards; the last game will be played like the only one no matter what, so the second-to-last will be treated as the last and will be as well, and so on through to the first. It's only when the number of games played is infinite, or (more realistically) there is some uncertainty about whether future games will be played, that players might find it better to try other strategies. I did learn how to analyse these indefinite games, but I've forgotten a fair bit and will have to brush up before commenting on an indefinite 5 pirates.
Being rational means that you have to play the best possible response to your opponents in every situation you may find yourself in. But an interesting fact is that it can be a curse if others know that you are rational. The paradox that you've hit on is that being rational doesn't necessarily lead to the best outcome, but you think that should be the definition of rationality. But what you're suggesting is that the pirates threaten to do something counterproductive in the future unless they are paid handsomely for it. They want their opponents to think they are not rational, because then they might fear getting thrown overboard.
It's a bit like if I calmly threaten to trash my dad's house if he doesn't cook me eggs on toast. I would like eggs on toast, but unfortunately he knows I will not do a counterproductive thing like trash the house just to save myself 10 minutes' cooking. So a mad person would get away with it, but a rational person won't (unless they're thought to be mad).
Edit: This example is a bit different to the pirates one because I would not actually be willing to trash the house for a 2% chance of being made eggs. If I think of a better one, I will add it here.
I'll relate it to the actual 5 pirates this time.
The rational solution is 97:0:1:0:2, which is achieved as each proposer buys, at the smallest possible price, the votes of the pirates who would be short-changed under the next proposer.
If the pirates decide to demand 50 coins, and pirate 5 ignores them, and they throw him overboard, and then they play rationally, the outcome is 97:0:2:1. So pirates 1 and 3 have done something counterproductive by killing 5 and cheated themselves out of 1 coin.
If they instead demand 50 again, and pirate 4 ignores them, gets thrown overboard, the new rational outcome is 99:1:0. Once again, pirates 1 and 2 have cheated themselves out of 1 coin.
If they now demand pirate 3 give them 100 coins, and he ignores them and they throw him overboard, pirate 1 would get everything, so once again pirate 2 has cheated himself out of 1 coin.
So as pirate 5 is sitting there scratching his beard as pirates 1 to 4 are demanding 50 coins, he will be thinking to himself: Why would 2 throw 3 overboard? Why would 1 and 2 throw 4 overboard? Why would 1 and 3 throw me overboard? And if he can't answer that question by saying that they are angry or stupid or reckless or they value something more than gold, he will not listen to their demands.
Since it's been 6/7 of a fortnight that this thread has been waiting for a response from OP, I hope no one minds if I post the answers for the heck of it.
In game theory, it is the difference between an equilibrium (where everyone is doing the best they can given what everyone else is doing), and what is called a subgame-perfect equilibrium, which, as the name implies, also has everyone in an equilibrium in all of the possible subgames or alternative points of decision. For example, when only pirates 1 and 2 are left, that is a subgame. When an offer has been made, a new subgame is entered. And so to solve the whole game you have to solve each of these possible situations through a process called backward induction. This is how connect four was solved, and checkers, and perhaps one day quite far from now, chess: by working backwards from each possible ending. There can be several solutions, but only if some players are indifferent between some of their decisions.
And to reiterate, the reason that a subgame-perfect equilibrium is preferred is that a rational player is never allowed to deviate from optimal strategy in any possible situation that can arise, and therefore, ironically, their commitment to future rationality can be used against them in the present.
To get around this problem, a pirate would have to find some tangible way of changing his own or his opponents' motives, that is, the reward scheme (what is called a payoff structure). So basically he would have to create a situation where they give him what he wants or they die or lose out, and where he couldn't change this even if he wanted to. I have no idea how this would be done on a pirate ship with gold, but some examples include burning your own boats so that your enemy knows you won't turn back, breaking your hand so you can't sign a cheque, going to sleep with the door locked so you don't have to (in fact, and this is the point, can't) deal with something, or drinking alcohol so that you cannot drive. You cannot simply utter a verbal threat or promise if you are totally rational and the others know what rational people will do, because verbal pronouncements can easily be insincere.
I will grant, however, that with real people, I expect play to be a lot more like you say it is, because people value equity over pitiful amounts of gold.
Hello!
Just to offer you a reply that is relatively short, sharp and to the point to begin with, the answer is that rational players cannot do this. To explain it intuitively, in this game everybody knows what everybody else's motives are, and the players, being rational, cannot disobey their motives. And nothing the players can say to each other will alter their motives. Given that, they cannot blackmail each other because nothing they say has any credibility. Pirates 1-4 might say they will not vote if they're offered less than 50 coins, but if pirate 5 ignores them, they don't have any reason to kill him because they are not in a position to get any more than they're offered. And everyone knows this at the outset. So even though bargaining might work well if you can convince the others you are mad or reckless, if you are truly rational and the others know it, you have zero verbal bargaining power.
The question isn't: why is there something? The universe has no alternative state to an existing one.
The really hard question is: Why do things exist in the precise way that they do?
To Bob - I think your friend has given a very good definition of nothing! Namely:
Nothing is that which isn't worth worrying about.
If you mean the whole thing, I think it's clear there is no simple formula. It's a complete mess. Even the best polynomial of degree ten is an awful fit that accounts for only 27.8% of the variation, which was calculated at desmos.com as approximately:
f(n) = 2.6039·10^-14 n^10 - 9.4352·10^-12 n^9 + 1.5318·10^-9 n^8 - 1.4057·10^-7 n^7 + 0.00000767 n^6 - 0.00024247 n^5 + 0.0038791 n^4 - 0.012534 n^3 - 0.44326 n^2 + 5.0707 n + 10.41
A simpler rule, accounting for only 5.64% of the variation, is -0.0085213 n^2 + 0.5386 n + 19.183
A graph of this is given here: https://www.desmos.com/calculator/u4ccy8dlay
But then, you could say the same about the Fibonacci sequence, and that has a formula of sorts. At any rate, though, I sure can't see one
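If anyone wants to evaluate those fitted rules away from Desmos, here is a minimal sketch using the quoted coefficients (Horner's method, highest-degree coefficient first):

```python
# Coefficients of the two fits quoted above (degree-ten and quadratic),
# listed from the highest power of n down to the constant term.
deg10 = [2.6039e-14, -9.4352e-12, 1.5318e-9, -1.4057e-7, 0.00000767,
         -0.00024247, 0.0038791, -0.012534, -0.44326, 5.0707, 10.41]
quad = [-0.0085213, 0.5386, 19.183]

def horner(coeffs, n):
    """Evaluate a polynomial by Horner's method."""
    total = 0.0
    for c in coeffs:
        total = total * n + c
    return total

print(horner(quad, 0))    # 19.183 (the quadratic's constant term)
```

At n = 0 each rule simply returns its constant term, which is an easy sanity check on the coefficient order.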
It seems to me that the original intent of the thread was to multiply the stated number by each of its digits, in ascending order, and then either state a new number and repeat or repeat the process with the product obtained after all digits have been used. That would have yielded a more interesting sequence too. But since I'm here, 525 duovigintillion 810 unvigintillion 643 vigintillion 939 novemdecillion 383 octodecillion 144 septendecillion 408 sexdecillion 655 quindecillion 691 quattuordecillion 666 tredecillion 896 duodecillion 156 undecillion 911 decillion 912 nonillion 285 octillion 969 septillion 118 sextillion 88 quintillion 134 quadrillion 656 trillion multiplied by 54 is equal to 28393774772726689798067407350012392473243263442332376759271424000000000000 or approximately 28.4 trevigintillion. I hope somebody has checked all these digits and verified all the terms of the sequence lol!
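For anyone who does want to check those digits, here is a quick sketch that rebuilds the named number from its parts (short scale: trillion = 10^12, and each name up adds a factor of 10^3, so vigintillion = 10^63 and duovigintillion = 10^69) and multiplies:

```python
# Each named group from the post, keyed by its power of ten.
parts = {
    69: 525, 66: 810, 63: 643, 60: 939, 57: 383, 54: 144, 51: 408,
    48: 655, 45: 691, 42: 666, 39: 896, 36: 156, 33: 911, 30: 912,
    27: 285, 24: 969, 21: 118, 18: 88, 15: 134, 12: 656,
}
n = sum(value * 10**power for power, value in parts.items())
print(n * 54)   # the 74-digit product quoted above
```

Python's arbitrary-precision integers make the verification exact, with no rounding as the terms grow.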
I feel really slow, but I have to ask: Why can't he just take only birthday pages? I know it's stated that he "must eventually take all of the pages", but that's nonsense when it comes to infinity. He could just reply "Well, I'm taking one non-birthday page every googol (or far more) years, so eventually I will". I suppose I think of that immediately because I have little trouble grasping this paradox; he can defer the non-birthday pages indefinitely while still taking them at some legitimate ratio.
FUN FACTS: The Julian calendar (Old Style) had years consisting of an average 365.25 days. According to this calendar, every century year is a leap year (they are all divisible by four). But in the Gregorian (New Style) calendar, only those century years that are divisible by 400 are counted as leap years. This means that in a 400-year cycle, there will be 97 leap years instead of 100.
This makes the average year 365.2425 days. In addition, it is 52.1775 weeks (26.08875 fortnights), and the average month is 30.436875 days.
As compared to the Julian calendar: 365.25 days, 52 and 5/28 weeks (52.17[857142857142]...), 26 and 5/56 fortnights (26.089[285714285714]...), and an average month of 30.4375 days.
However, the average year is still a Julian one over any 196-year span from xx01 to the following xx96 that contains a century year divisible by 400. In particular, more practically, the set of years from 1901 to 2096 (whose only century year, 2000, is a leap year) still has an average of 365.25 days.
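These averages are easy to verify with exact fractions:

```python
from fractions import Fraction

# Average year length implied by each calendar's leap-year rule.
julian = Fraction(365 * 4 + 1, 4)            # 1 leap year in every 4 years
gregorian = Fraction(365 * 400 + 97, 400)    # 97 leap years in every 400 years

print(float(gregorian))        # 365.2425 days
print(float(gregorian / 7))    # 52.1775 weeks
print(float(gregorian / 12))   # 30.436875 days per average month
print(julian / 7)              # 1461/28, i.e. 52 and 5/28 weeks
print(float(julian / 12))      # 30.4375 days per average month
```

Using Fraction keeps the repeating decimals exact (1461/28 is the 52.17[857142]... figure quoted above).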
Update: I was off with my first post, except for the actions of the lowest 202 pirates were they to remain. Your solution is absolutely correct to the smallest detail, and my numbering scheme was just the reverse of yours. The very first pirate, incidentally, loses his head with 136 votes out of 300 (the 35 below him attempting to save themselves by supporting him).
Just to make the reasoning for this explicit, I will continue numbering the oldest first and explain the entire process.
Down to just 299 and 300, 299 would take everything.
298 will offer 300 1 coin and 299 nothing.
297 will offer 299 1 coin and the others nothing.
296 will offer 298 and 300 1 coin.
295 will offer 297 and 299 1 coin.
And so on. In general, odd numbers will give 1 coin to all odd numbers and even numbers will give 1 coin to all even numbers, until they run out of coins.
The pattern does indeed become interesting after number 100 has to give away everything to keep his head. He survives, and so does 99, with 101 votes out of 201 and 202. 99 pays 100, but besides that the pattern is the same.
98 is killed even if he pays 99 and the even numbers, because 101 votes are no longer enough (he would need 102 out of 203). This means that the pirates need votes for free. As I overlooked originally, they will get free votes, because those who know they will otherwise die will vote for them.
For example, 98 votes for 97 without being paid, and 97 survives with 102 votes out of 204 (97 pays those whom 98 would have paid).
96, 95 and 94 are killed, but with their support 93 makes 104 votes out of 208 (he pays 97, 98, 101 and odd numbers to 295).
92 to 86 are all killed, but with their support 85 makes 108 votes out of 216 (he pays 93 to 96, 99 and even numbers to 288).
84 to 70 are all killed, but with their support 69 makes 116 out of 232 votes (he pays 85 to 92, 97, 98, 101 and odd numbers to 279).
68 to 38 are all killed, and 37 is the final survivor, who with their support makes 132 votes out of 264, and pays 69 to 84, 93 to 96, 99 and even numbers to 256.
I think we see that the pattern is that the number of free votes required keeps doubling. First 1, then 2, then 4, then 8, then 16, then 32 votes were required after the 100 coins were spent. It is clear enough that if there had been at least 28 more pirates, number 1 could have made a 164 out of 328 majority, confirming this pattern.
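A quick numeric check of this doubling pattern, using the survivor positions and vote counts from the posts above (survivors below number 100 sit at positions 101 - 2^k, and the free votes come from the 2^(k-1) - 1 doomed pirates ranked just below each one):

```python
# Verify the survivor positions, vote totals and majorities for 300
# pirates and 100 coins, as worked out in the posts above.
rows = []
for k in range(1, 7):
    survivor = 101 - 2**k
    free = 2**(k - 1) - 1            # doomed pirates voting to stay alive
    votes = 1 + 100 + free           # himself + 100 bought votes + free votes
    remaining = 300 - survivor + 1   # pirates still aboard when he proposes
    assert 2 * votes >= remaining    # exactly 50%, which is enough
    rows.append((survivor, votes, remaining))
print(rows)
```

This reproduces 99 with 101 of 202, 97 with 102 of 204, 93 with 104 of 208, 85 with 108 of 216, 69 with 116 of 232, and 37 with 132 of 264, each one exactly on the 50% line.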
I solved this problem, but I'm afraid I don't even understand your answer. The solution I arrived at is that, where the highest ranking (oldest) pirate is #1 and the lowest #300, every pirate from #1 to #98 will be killed in a futile attempt to satisfy the masses with overstretched coin (they offer one coin to every pirate above them up to #100, and then every odd number, if they are odd, or even number, if they are even, as per your rule of paying the highest-ranking). They get out-voted by the masses of lower-ranked pirates who are trying to jump the queue. But #99 is the first pirate who can pay a majority (101 out of 202 including himself), but he has to give away every single coin to #100 and all the odd numbers up to #297 so they will spare his life.
Edit: I think I messed up the above analysis. I forgot to take into account that the pirates prefer to stay alive, and those that would be killed, beginning with #98, would therefore vote for those above them. I am redoing the analysis now, and will also assume that the pirates would prefer to throw a higher-ranking pirate overboard, if a lower-ranking one will offer him the same deal.
It's actually, ironically, because being rational can make you vulnerable. In real life, you won't offer somebody one coin because they will throw it in your face. But a rational money-maker doesn't necessarily care about fairness or principle or vengeance; he has to take whatever he's offered if the offer is final, and everybody knows it. So people will give him the absolute minimum, and he will take it.
The pirates would like to make a contract or a threat by saying that they will not accept any offer below, say, 15 coins, but there is just no way to do this convincingly. A totally rational player doesn't have something that you or I might have in this situation: reputation. They are predictable and can't bluff their way into getting more than 1 or 2 coins. When the offer stands, they have no choice. (But then, even if they could make a contract, they would all demand 50 coins because A only needs two votes. And then A would want to make a contract saying he won't pay above 1 or 2 coins. And the others would want to promise more to get him thrown overboard and so on. Cooperative outcomes can be extremely difficult in competitive situations.)
I think you are correct that what E does will depend on the interpretation of the question, and I also think that crandles is right that what D would do in a three-pirate arrangement will as well. I'm not satisfied with the answer that 0 votes out of 0 qualifies as 50% or more in favour, but unless we argue that the percentage is supposed to include the oldest pirate (note that C would have to buy D and E) I guess that's a philosophical problem. Nevertheless, I think that, interpreted literally, you are correct. That's a nice catch!
I suppose I will now just detail the several possible solutions.
As has been pointed out, there are several assumptions that must be made before a solution can be given. We must decide, firstly, whether E would keep everything or be condemned to suicide if he was the last remaining. We also must decide whether the pirates are still a 'bloodthirsty bunch' as they were in the last version, and prefer to throw each other overboard, all else being equal. Finally, if they are, we must decide whether this tendency is greater than their desire to stay alive (although I think it's clear enough that it is). There are several alternatives, then:
The one that seems most consistent with the proposed answer is that the pirates are not greedy, and E keeps the bounty. This would proceed as follows:
E would throw D overboard unless D offers 0:100. E accepts this offer.
C buys D for 1 coin.
B buys E for 1 and D for 2 coins.
A buys C for 1 and E for 2 coins. 97:0:1:0:2
The losers reject the offers in vain.
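Just to make the backward induction for this first scenario concrete, here is a small Python sketch. It hardcodes the two-pirate base case described above (D must offer 0:100), and assumes, per this scenario, that the proposer does not vote and needs at least half of the other pirates' votes, and that a pirate votes yes only when offered strictly more than his continuation payoff:

```python
def solve(n, coins=100):
    """Backward-induction payoffs, oldest pirate (the proposer) first.

    Assumptions, per the scenario above: the proposer does not vote and
    needs at least half of the other pirates' votes; a pirate accepts
    only if offered strictly more than his continuation payoff; with
    two pirates left, the proposer must hand everything over.
    """
    if n == 2:
        return [0, coins]                  # D offers 0:100 and E accepts
    cont = solve(n - 1, coins)             # payoffs if the proposer is thrown over
    votes_needed = -(-(n - 1) // 2)        # ceiling of half the other votes
    # Buy the cheapest votes: pirate i (0-indexed, i >= 1) can be bought
    # for one coin more than his continuation payoff cont[i - 1].
    cheapest = sorted(range(1, n), key=lambda i: cont[i - 1])
    offer = [0] * n
    for i in cheapest[:votes_needed]:
        offer[i] = cont[i - 1] + 1
    offer[0] = coins - sum(offer)
    return offer

print(solve(5))   # [97, 0, 1, 0, 2]: A keeps 97, buys C for 1 and E for 2
```

Running solve(4) and solve(3) reproduces the intermediate steps above, 97:0:2:1 and 99:1:0.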
If E keeps the bounty but the pirates are greedy:
D offers E everything. E throws him over anyway.
C keeps everything. He gets the vote of D because D does not want to die.
B buys D and E for 1 coin each.
A buys C for 1 and D or E for 2 coins. 97:0:1:2:0 or 97:0:1:0:2
If E loses his life at the end, which I think is most faithful to the literal text:
D keeps everything because E does not want to die.
C buys E for 1 coin.
B buys D for 1 and E for 2 coins.
A buys C for 1 and D for 2 coins. 97:0:1:2:0
I want to see what would happen if the percentage did include the oldest pirate. After all, it did in the first version with identical wording.
E keeps bounty, pirates not greedy:
E accepts 0:100.
C offers E everything as well. D votes because he is not greedy.
B buys C and D for 1 coin each.
A buys E for 1 and C and D for 2 coins each. 95:0:2:2:1
E keeps bounty, pirates greedy:
E throws D overboard no matter what.
C offers E everything and D accepts because he does not want to die.
As above. 95:0:2:2:1
E dies if alone:
D keeps everything.
C gives D everything and E accepts to survive.
B buys C and E for 1 coin each.
A buys D for 1 and C and E for 2 coins each. 95:0:2:1:2