# 18.6 Different Prices for Buying and Selling

Up until now, we have considered the case in which an agent can buy or sell goods at a constant price; in particular, we've been assuming that the price they could get by selling some of their own endowment is the same as the price they'd have to pay for additional units of that same good.

However, there are lots of economic applications in which the price you get when you go to sell something is (perhaps much) less than the price you'd have to pay for more of it. This can happen, for example, when there are transaction fees: if you try to sell a ticket to an event on StubHub or TicketMaster, the platform takes a cut of the deal, so the seller receives a lower price than the buyer pays.

How does this play out in this model? Let's investigate the example of buying and selling tickets, so let's have "good 1" be tickets and "good 2" be money. Let's say you are a season ticket holder to a sports team, so you have 40 tickets; let's also assume you have \$1200 in the bank. Therefore your endowment is $(40\text{ tickets}, \$1200)$. You can choose to sell your tickets online for $p^\text{sell} = \$25$ each, but it would cost you $p^\text{buy} = \$60$ per ticket to buy additional tickets. How can we construct your budget constraint?

- First of all, you don’t have to buy or sell any tickets, so your endowment point $(40, 1200)$ must lie on your budget constraint.
- If you sold all your tickets, you'd get $25 \times 40 = 1000$ more dollars, bringing your total money to $\$1200 + \$1000 = \$2200$; so $(0, 2200)$ is the vertical intercept of your budget constraint.
- If you spent all your money buying additional tickets, you could buy $1200 / 60 = 20$ more tickets, for a total of $40 + 20 = 60$ tickets; so you can also consume at the point $(60, 0)$.
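The three points above can be checked with a quick sketch of the kinked budget constraint in code (a minimal illustration; the function name and default parameters are our own, taken from the numbers in this example):

```python
def budget_money(tickets, e1=40, e2=1200, p_sell=25, p_buy=60):
    """Money you'd hold if you end up with `tickets` tickets,
    starting from an endowment of e1 tickets and e2 dollars."""
    if tickets <= e1:
        # Selling (e1 - tickets) tickets at the lower sell price
        return e2 + p_sell * (e1 - tickets)
    else:
        # Buying (tickets - e1) extra tickets at the higher buy price
        return e2 - p_buy * (tickets - e1)

print(budget_money(0))   # sell everything: 2200
print(budget_money(40))  # endowment point: 1200
print(budget_money(60))  # spend all your money on tickets: 0
```

The `if`/`else` is exactly where the kink comes from: the budget line has slope $-25$ to the left of the endowment and slope $-60$ to the right of it.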

Plotting these all together, we can see a kinked budget set:

What should you do? It depends on your utility function. Here, instead of just using the utility function $u(x_1,x_2) = x_1x_2$, let's use the more general Cobb-Douglas utility function $u(x_1,x_2) = x_1^\alpha x_2^{1-\alpha}$. This has an MRS of \(MRS(x_1,x_2) = {\alpha \over 1 - \alpha}{x_2 \over x_1}\), so at your endowment of $(40, 1200)$ your MRS is \(MRS(40, 1200) = {\alpha \over 1 - \alpha}{1200 \over 40} = {\alpha \over 1 - \alpha} \times 30 \text{ dollars per ticket}\). If the price you could get from selling a ticket were greater than this, you'd want to sell some of your tickets; and if the price you had to pay for additional tickets were less than this, you'd want to buy more. In other words, the MRS at your endowment determines your cutoff price for buying and selling tickets.
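The cutoff price can be computed directly from the MRS formula above (a quick sketch; the function and variable names are ours):

```python
def mrs(alpha, x1, x2):
    # Cobb-Douglas MRS: (alpha / (1 - alpha)) * (x2 / x1)
    return alpha / (1 - alpha) * x2 / x1

# Cutoff price at the endowment (40 tickets, $1200) for a few values of alpha
for alpha in (0.2, 0.5, 0.8):
    print(alpha, mrs(alpha, 40, 1200))
```

Note how the cutoff scales with $\alpha$: at $\alpha = 0.5$ it is exactly \$30 per ticket, a lower $\alpha$ (caring less about tickets) pushes it down, and a higher $\alpha$ pushes it up.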

However, what happens if the price you could sell tickets for is below your cutoff price, while the price you'd have to pay to buy them is above your cutoff price? Then you might optimally neither sell tickets at the low price nor buy additional tickets at the high price. This is illustrated in the case below. You can see that when $\alpha = 0.50$, your cutoff price is $p = 30$; so you are neither willing to sell tickets for \$25, nor buy additional tickets at \$60:
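Putting the pieces together, this logic amounts to a simple decision rule at the endowment: sell if the sell price exceeds your cutoff, buy if the buy price is below it, and otherwise stay put. A small sketch (our own illustration, with the numbers from this example as defaults):

```python
def action_at_endowment(alpha, e1=40, e2=1200, p_sell=25, p_buy=60):
    cutoff = alpha / (1 - alpha) * e2 / e1   # MRS at the endowment
    if p_sell > cutoff:
        return "sell some tickets"
    elif p_buy < cutoff:
        return "buy more tickets"
    else:
        return "stay at the endowment"

print(action_at_endowment(0.5))  # cutoff $30: 25 < 30 < 60, so stay put
print(action_at_endowment(0.2))  # cutoff $7.50: selling at $25 is attractive
print(action_at_endowment(0.8))  # cutoff $120: buying at $60 is attractive
```

Because $p^\text{sell} < p^\text{buy}$, there is a whole range of cutoff prices (here, anything between \$25 and \$60) for which staying at the endowment is optimal; with a single price, that range collapses to a single point.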

Try playing around with the graph to determine how your behavior would change:

- if $\alpha$ increases or decreases
- if your initial endowment of tickets, $e_1$, increases or decreases
- if your initial endowment of money, $e_2$, increases or decreases
- **extra challenge:** if you could *sell* tickets for a high price or *buy* them for a low price (but not both)

If you can answer those questions *intuitively* and demonstrate why they’re true *mathematically*, you’ve really internalized the main points of this chapter.