Data and Shipping

Financial Markets

By Basil Karatzas 2015-03-03 14:21:31

The state of the dry bulk market has been making big news, to the extent that mainstream business publications have started covering the market and pontificating on what may have gone wrong. Just a year ago, the mood was much more constructive, and people in the market effectively voted with their wallets, believing that the darkest days were behind us.

China has taken some blame for its unexpected slowdown, which has been weighing on shipping. Miners and captive cargo interests have been blamed for building up their own fleets and thus shaving the 'peaks off the market'. Shipowners have been blamed for overbuilding (again), giving substance to the adage that 'shipowners are their own worst enemies'. And institutional investors and the famous private equity funds are taking some of the blame for fueling the oversupply wave, as plenty of money has been invested directly with the shipyards or through strategic partners.

The blame probably has to be shared among several factors, as no single cause is ever the absolute driver of a market shift. We suspect, however, that another cause of the present state of the market is the way markets and trades have been modeled, analyzed and expected to behave. In the aftermath of the subprime crisis, it emerged that the residential real estate data in the US going back a century were limited to certain areas of New England; that, however, didn't stop analysts from extrapolating the data to the whole US, east to west and north to south, agrarian, urban, suburban, industrial and tourist destinations, all in one single model, and from issuing credit instruments on effectively unrelated asset classes under that same model.

Even for a novice, it's not difficult to see that a mortgaged farmhouse in Iowa has little to do with a mortgaged oceanfront property in La Jolla, California, besides both being administered under US bankruptcy law. However, the need for data to analyze and model led to collecting and extrapolating the wrong data (or limited data), and too many assumptions were made in order to build a business model. Likewise, the human factor was minimized, as it is not easily quantified or correlated with market performance; the human factor typically shows itself in stampedes and massive moves – which are often associated with black swan events – but is otherwise relegated to a footnote of the model.

How did the present state of the dry bulk market catch so many people by surprise, including institutional investors who are seasoned in the game of predicting the market and turnaround situations – investors typically reputed to be the smartest people in the room?

A lot of decision making has come to depend on modeling, typically by young analysts with little experience in the maritime industry and insufficient understanding of the 'soft' issues in shipping. Models for shipping projects are typically transplanted from other industries and expected to work in shipping – since they have worked in other industries in the past. Collecting and accessing detailed data in shipping is hard, and there are a couple of well-known databases providing asset prices, freight rates, etc. Such databases have become the 'bible' for many an analyst; shipping projects for which the databases do not provide 20-year time series are automatically rejected, even if the project amounts to the sale of gold bullion at a discount. If there is no long-term data – sorry, no deal.

By default, most investors gravitate toward the asset classes with the most historical data, so certain asset classes and segment projects become more competitive simply because the funds now have the data 'to get their hands around' them. Regrettably, the data is not always transparent, universal or liquid; several reports, for instance, quote one-year, three-year and five-year time charters in certain asset classes. As anyone who has worked a single day at a chartering desk can attest, long-term charter markets are usually illiquid, often extremely so, and quite often the quoted rate is the broker's guesstimate of the five-year charter – since there is no active market. This is the most extreme and obvious example of data created out of thin air, with investment decisions made on data made out of thin air. And, quite frankly, there have been shipping projects that came out of thin air just because of this. Go figure!
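To make the 'thin air' point concrete, here is a toy sketch (not from the article, all figures hypothetical) of how a 'five-year time-charter rate' can be manufactured where no trades exist: simply extend the trend between the liquid one-year and three-year quotes out to five years. The number comes out looking authoritative, but no charterer ever paid it.

```python
# Toy illustration: a "five-year TC rate" extrapolated from shorter,
# liquid tenors. Inputs are hypothetical broker quotes in $/day.

def extrapolated_rate(rate_1y, rate_3y, target_years=5):
    """Linearly extend the 1y->3y trend out to the target tenor."""
    slope = (rate_3y - rate_1y) / (3 - 1)   # $/day per extra year of tenor
    return rate_1y + slope * (target_years - 1)

# Hypothetical quotes: $8,000/day for one year, $9,000/day for three years.
print(extrapolated_rate(8_000, 9_000))  # a manufactured "5-year rate" of 10000.0
```

The precise-looking output is exactly the kind of figure that can end up in a time series and then in a model, even though it reflects an assumption about the curve rather than an observed fixture.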

Using market data superficially can be hilarious, or frustrating, but it is most unprofessional when done by professional investors and money managers. We were working with an institutional investor who, after long hours of 'educating', opted to buy two handysize bulkers from a Chinese shipyard at $16 million each at a time when a well-known industry database was quoting such vessels at $22 million. This was the benchmark for their decision making, and they had managed to buy at a distressed price, as per the fund's masthead on their website. How could they be wrong?

For starters, those vessels had been on the market for more than six months with no takers. For people long enough in shipping, the Chicago school's efficient-market hypothesis is a theorem derived from experience: if you see a $100 bill lying in the street, it's probably fake, because the market is efficient and someone else would have picked it up earlier. Attractively priced vessels never linger in the market for six months, even in bad markets. Then, these handysize vessels were the first vessels this Chinese shipbuilder (a 'greenfield yard') had ever built on its own account; again, the fund was happy, since the vessels were duly classed by an independent third party, the China Classification Society. In our professional experience, several million dollars are required to bring 'greenfield yard' projects up to par, if that is doable at all – a great deal of such tonnage will be looking for better days at a ship-breaker's yard; that's the sad, sorry truth. The moral of the story is that the institutional investor used a benchmark price to make an investment decision; although they obeyed their mandate to buy 'distressed' (or value, as it is advertised), they used the benchmark wrongly; for most people in shipping, that was a sucker's investment. And do not let the model fool you for a second.

Data providers are building up their databases with additional information and time series, including discounted cash flows, algorithmic vessel valuations, market projections, etc. While in the past operating shipowners controlled the market, financial investors are now becoming more crucial – hence the need to feed them with more data. We are very fond of more data and of 'big data', but we are also suspicious of cookie-cutter models where there is little 'checking under the hood'. It's not that the data is wrong, but accurate data in shipping is hard to get on a timely basis, and often the data needs to be qualified through due diligence – we see it every week in the sale-and-purchase market with reported sales and reported prices: not everything is what it seems. Giving an analyst data, a model and assumptions to run – often all of them provided under a sell-side business model – is a scary combination. Projects are qualified or disqualified against the wrong benchmarks, and the market gets distorted for the wrong reasons.
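A minimal sketch of the kind of 'algorithmic vessel valuation' described above – a discounted cash flow over an assumed charter rate. All figures are hypothetical and illustrative only; the point is that the model happily turns any input, including a broker's guesstimate, into a precise-looking price.

```python
# Minimal DCF vessel valuation sketch. All inputs are hypothetical
# assumptions, not market data; the output is only as good as they are.

def dcf_vessel_value(daily_rate, daily_opex, operating_days,
                     years, discount_rate, scrap_value):
    """Present value of net charter earnings plus discounted scrap value."""
    value = 0.0
    for year in range(1, years + 1):
        net_cash_flow = (daily_rate - daily_opex) * operating_days
        value += net_cash_flow / (1 + discount_rate) ** year
    value += scrap_value / (1 + discount_rate) ** years
    return value

# Hypothetical handysize bulker: $9,500/day rate, $5,000/day opex,
# 350 operating days/year, 15 remaining years, 10% discount, $4m scrap.
print(round(dcf_vessel_value(9_500, 5_000, 350, 15, 0.10, 4_000_000)))
```

Note how sensitive the result is to the assumed charter rate: nudge the rate a couple of thousand dollars a day and the 'valuation' swings by millions, which is precisely why feeding such a model an illiquid or manufactured rate is dangerous.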

We are not sure that the present shape of the market is caused by wrong benchmarking, but when the so-called ‘masters of the universe’ have a model even for the wind, there probably has been a butterfly effect somewhere that was missed.

Then again, there will be more articles like 'The Expensive Shipping News for Wall Street's Smart Money', recently published in the Financial Times.

Who said that shipping is an exact science?

The opinions expressed herein are the author's and not necessarily those of The Maritime Executive.