Please fork it for us lazy ones
If you are on an old version of iOS and mourn the loss of the DarkSky app, I personally like MyRadar - it has some very nice features and I used it in tandem with DarkSky in the past.
If you are writing an iOS app, you already have an Apple developer account, which includes 500k WeatherKit calls/month.
If you are writing an iOS app and want the DarkSky API format rather than WeatherKit, Pirate Weather should also be a drop-in replacement.
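If you want to sanity-check the drop-in claim, the request shape follows the old Dark Sky convention. Here's a rough sketch in Python - the API key and coordinates are placeholders, and the field names are what I'd expect from the Dark Sky format, so verify against the Pirate Weather docs:

    # Sketch of a Dark Sky-style request against Pirate Weather.
    # API key and coordinates are placeholders; field names assume the
    # Dark Sky response format that Pirate Weather aims to mirror.
    import requests

    API_KEY = "YOUR_PIRATE_WEATHER_KEY"   # placeholder
    LAT, LON = 45.42, -75.69              # example coordinates

    url = f"https://api.pirateweather.net/forecast/{API_KEY}/{LAT},{LON}"
    resp = requests.get(url, params={"units": "si"}, timeout=10)
    resp.raise_for_status()
    forecast = resp.json()

    # Dark Sky-style blocks: "currently", "hourly", "daily"
    print(forecast["currently"]["temperature"])
    print(len(forecast["hourly"]["data"]), "hourly data points")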
This answers the first question I had upon seeing the name, which was, is it free, open, documented and legal? Based on the above the answer seems to be "probably".
Very commendable effort, and I hope the project can last. It seems to be very difficult to maintain a free and reliable weather API so hopefully the dev is not biting off more than he can chew.
I think it says something that we live in a world where NOAA data could be seen as an underground or less than legal means of getting the weather.
Perhaps AccuWeather has been successful in their campaign to keep free weather data as obscure as possible.
I'm imagining designing a software product around this and presenting it to a C-Level, explaining that we use "PirateWeather" and I think I'm going to get grilled with lots of questions and concerns based on the name alone.
This is a good service and should be "branded" with a better name. Maybe a play on the whole DarkSky name like LightSky or "Sunset" which works exceptionally well since DarkSky was sunset by Apple. Maybe StarrySky, LateSky, NewSky.
I am usually someone who says that names don't matter as much as people think they do, but PirateWeather just seems like a huge hit in the wrong direction. But the product is solid so maybe it can survive despite the name.
More likely
1. they will not ask
2. If they ask, they will not care
3. If they care, a short explanation will settle it
this seems like a non-issue, and if the biggest thing you have to worry about is some C-level with a stick up their ..... well, I think you have nothing to worry about then
the name is fine...
Maybe we're on different webs, but in my mind "pirate" has good connotations. As in, a rebel, a free spirit, a fighter of oppression.
That depends on who you ask... People who are fans of, e.g., ThePirateBay probably would disagree with you.
Also, there's a shipping service called PirateShip that's totally legal (it's basically a frontend to USPS shipping labels). The website is pretty amusing with the "Arr, matey!" stuff.
I’m a massive fan of - and indeed contribute data to - the weather underground project, but the naming has always made me a little uncomfortable.
Alternatively, imagine what the world would be like if we spent less time thinking about what C-suite executives think.
Why do you draw that conclusion? It seemed to be making a joke about the fact that APIs are NOT copyrightable and considered fair use. The site is pretty clearly fully legal.
I use Willy Weather, which uses BOM data. I don't love it though. Any recommendations? I personally can't stand the actual BOM app as an iPhone user, but I appreciate that they tried…
R.I.P Pocket Weather :(
Have you tried the official iOS BOM app in the past few weeks? I recently switched to it from Willy Weather as it has improved a lot. Its display of the next 90 minutes of modelled rain radar is the official app's killer feature for me.
Last week it enabled me to drive 40mins to some mountain bike tracks, knowing rain models showed that the massive storm would have just passed by the time I'd get there.
This is a timely post and comment as I’ve been thinking of solving a weather problem I have with that feed. Hoping the hn hive brain might have seen the same.
Came here, like I come to every weather API/tool discussion, to ask the same thing ... I would really like a less spammy, less bloated 10-day forecast a la WU.
This comes close, however - there's a nice 7-day lookahead ... is it possible that WU is just fudging days 8-9-10 and no real data is available beyond 7 days?
Another method that's occasionally used is to just fill in with TMY (Typical Meteorological Year) data. Lots of those data sets are freely available, or if not, are very inexpensive to calculate if station data is available.
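If you wanted to fake the missing tail of a forecast that way, the logic is simple enough - here's a hypothetical sketch (the forecast structure, TMY lookup, and field names are all made up for illustration):

    # Hypothetical sketch: pad a 7-day forecast to 10 days with TMY climatology.
    from datetime import timedelta

    def pad_with_tmy(forecast_days, tmy_high_by_doy, target_len=10):
        """forecast_days: list of {"date": date, "high_c": float} from a real forecast.
        tmy_high_by_doy: day-of-year -> typical daily high (degC) from a TMY data set."""
        padded = list(forecast_days)
        while len(padded) < target_len:
            next_day = padded[-1]["date"] + timedelta(days=1)
            padded.append({
                "date": next_day,
                "high_c": tmy_high_by_doy[next_day.timetuple().tm_yday],
                "source": "TMY climatology",  # flag that this is not a real forecast
            })
        return padded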
If you're looking for a minimally spammy, information dense forecast and you're in the US, it's pretty hard to beat weather.gov. (And make sure to occasionally read the zone and regional forecast discussion texts, too. They're really interesting and often educational!)
A while back I installed quite a few weather apps on Android, at the time, I felt wX and QuickWeather (from F-Droid) were the best. Maybe Flowx will be even better :-)
Link: https://weathergraph.app
Screenshots: https://impresskit.net/6430c7f0-b34b-418f-9824-f386f939be9a/...
What do you think?
Anyhow, here's the url:
For example: https://www.yr.no/en/details/table/2-4407066/United%20States...
You will see not only the cloud cover as an overall percentage, but also the different levels. A 50 in the middle is very different than 50 in low in terms of what you can expect that day (for photographs).
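If you want that per-level breakdown programmatically rather than from the site, MET Norway (the data source behind yr.no) exposes it via their locationforecast API. A rough sketch - the field names are what I recall from the "complete" response format, so check them against the api.met.no docs:

    # Sketch: per-level cloud cover from MET Norway's locationforecast API (yr.no's data source).
    # Field names are from memory of the "complete" format; verify before relying on them.
    import requests

    LAT, LON = 37.77, -122.42   # example coordinates
    resp = requests.get(
        "https://api.met.no/weatherapi/locationforecast/2.0/complete",
        params={"lat": LAT, "lon": LON},
        headers={"User-Agent": "my-weather-script/0.1 you@example.com"},  # api.met.no requires an identifying UA
        timeout=10,
    )
    resp.raise_for_status()

    for step in resp.json()["properties"]["timeseries"][:6]:
        d = step["data"]["instant"]["details"]
        print(step["time"],
              "total:", d.get("cloud_area_fraction"),
              "low:", d.get("cloud_area_fraction_low"),
              "mid:", d.get("cloud_area_fraction_medium"),
              "high:", d.get("cloud_area_fraction_high"))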
https://www.weather.gov/documentation/services-web-api
Or is this a friendlier overlay of their interface?
> All weather data comes from the AWS open data program https://registry.opendata.aws/collab/noaa/. This is a fantastic program, since it is more reliable than the NOAA distribution, and means there are no data transfer charges!
In summary: horrible oversight by the federal govt (read: Congress) of our technical/scientific forecasting resources means that our forecasting ability is extremely fragmented and poorly organized. This has led to a lot of companies being essentially resellers of public data. These companies claim to create a lot of value-added products ('cleaner APIs', 'minutecasts', etc etc) that are either scientifically dubious or technically simple, and then these companies walk away with huge profits based on being a portal to government data.
It's so American it's almost laughable, all while the European ECMWF eats our lunch in terms of accuracy even for the CONUS. I've discussed this on technical internet forums often enough that I can practically already write the replies to my own comment: "What's the problem with that?" and so on. But the reality of it is that it's emblematic of how politically broken the US is, in particular with regards to the agencies in charge of scientific products and funding. Not to mention the concrete problems with the forecast products themselves.
Anyway. Good luck pirate weather and godspeed. Information was meant to be free and open, especially the forecast. It's such a laughably simple problem that could/should be so easily solved but, alas, there is money to be made!
There is enormous data available at https://www.weather.gov/ at no charge, including hourly and weekly forecasts, spot forecasts, radar (multiple layers, with/without animation) and satellite (multiple layers, with/without animation), plus storm watches, hurricane info, historical data, climate data…
I guess it's nice that apps can do things like advise me that it might start raining in a few minutes, but often by the time I see those alerts, the water on my head has alerted me anyway.
All other weather apps, it seems to me, are for little more than tracking my location and serving me relevant ads.
Companies can add value by providing documented, consistent APIs for data that, yes, is free from the government. NWS does not face the market incentives of a company, and it shows.
And yes, a bunch of companies take that data and hype it, but that's not particularly new - it precedes apps. I've long seen claims of forecast accuracy from private companies that are, well, absurd, given the limits chaos (and other issues) place on forecasting very far into the future.
The Big Data Project is a substantial improvement in terms of access, but the data itself is still in the legacy formats. Also, some data is not well suited for mobile access - it's in giant binary blobs (NEXRAD Level II) or requires multiple HTTP operations to acquire.
But... at least it's available and free (except for lightning).
I'm not talking about model data - I let others worry about that, and for personal use, I use sites like the excellent one from College of DuPage ( https://weather.cod.edu/forecast/ ). I've watched friends in the research and operational community describe their frustration at the decision process that went into GFS modernization, and how it was frustrating to see ECMWF beat it out in forecast skill (I'm not up to date on where that stands now).
The diagnosis of the problems of the American forecast modeling community here is based on flawed premises. There are three major factors which led to the ECMWF leap-frogging the US in day-ahead forecasting capability. The first is the consolidated investment in supercomputing resources; the WRFIA tackles this by earmarking a much larger allocation of funding for NOAA's next gen supercomputer, but this still pales in comparison to ECMWF investments.
The second factor is the fragmentation of the research and operational weather modeling communities due to the divergent investment from NOAA and USAF in the '90s and 2000s; USAF, in conjunction with NCAR, sponsored the development of the WRF model, which was widely adopted by the research community. NOAA continued investing in the GFS lineage of models. The bifurcation of these communities slowed the transfer of advances in model development into operations, and this was exacerbated by an old, closed-off approach by NOAA which made it extraordinarily difficult to run and develop the GFS on anything other than NOAA hardware.
Finally, the ECMWF went all-in on 4DVAR data assimilation in the late '90s, whereas the American community pursued a diversity of other approaches ranging from 4DVAR to ensemble Kalman filters. 4DVAR necessitates advances to core weather model software (e.g. you need to write a model's adjoint or its tangent linear model in order to actually use 4DVAR), and the US's failure to adopt it led, in my opinion, to a "double-edged sword" effect of (a) failing to provide impetus to greatly improve the US modeling software suite and supporting tools, and (b) being stuck with a worse assimilation technique unless advanced EnKF techniques are employed using very large ensembles of models (expensive).
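For anyone unfamiliar, the standard (strong-constraint) 4DVAR cost function shows why the adjoint matters - this is the textbook form, not anything specific to GFS or IFS:

    J(x_0) = \frac{1}{2}(x_0 - x_b)^T B^{-1} (x_0 - x_b)
           + \frac{1}{2}\sum_{i=0}^{N} \big(H_i(M_{0 \to i}(x_0)) - y_i\big)^T R_i^{-1} \big(H_i(M_{0 \to i}(x_0)) - y_i\big)

Here x_b is the background state, B and R_i are the background and observation error covariances, M_{0->i} integrates the model from the analysis time to observation time i, and H_i maps model state into observation space. Minimizing J needs its gradient, and the gradient pulls in the transposes (adjoints) of the linearized M and H - hence the software burden described above.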
The other problem, as others have pointed out, is that there is no accountability in the US private sector weather market. Virtually every player is re-transmitting raw model guidance or statistically post-processed forecasts using DiCast, _maybe_ with some manual tweaking of forecast fields by humans. But this is not transparent, and many companies - if we're being charitable here - are not honest about what they're actually doing to produce their forecasts. Put another way - there are a lot of BS claims out there, and it seems that investors have been more than happy to fund them over the past few years.
I'm wondering how much your monthly bill is?
The monthly bill isn't great, but has been manageable through donations so far. I'm handling about 10 million API calls/month, so lots of Lambda invocations, but they're all thankfully very short with tiny outbound data transfer totals, and luckily inbound is free. If the costs ever get out of hand, the plan is to throttle the request rate down, but there are still a few optimizations I can do on the AWS side that should help (ARM here I come!).
"Lock-in" is just a trade-off like any other engineering decision, same as what programming language or API schema you choose. AWS is massive and reliable, and these services are cheap and widely available so I don't see much of an issue here.
Put another way, how would you implement this without self-managing pieces like nginx/apache, rabbitmq/kafka, or mongodb on a compute instance?
The pieces of managed infra they're depending on are all pretty interchangeable from one cloud provider to another. It's not the cheapest way to solve the problem if you discount the cost of management, upgrades, etc. But if you factor those costs in, it's quite competitive.
Certainly, I wouldn't get any vendor-specific services like Lambda or SNS (in this example)
AWS diagrams are pretty intimidating until you've built a few things with several AWS services.
The benefit is that a lot of maintenance work is taken care of for you, and your costs can be low if you don't need a lot of compute.
This diagram is actually pretty simple. It looks worse than it is. All it uses are Lambdas (serverless functions), S3 buckets (object storage), and SNS (broadcast/push queues). There appears to be one traditional server in there with EFS, which is just an elastic file system.
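To make those pieces concrete, here's a rough sketch of what one of those SNS-triggered Lambdas could look like - the message shape, bucket, and processing step are invented for illustration, not taken from Pirate Weather's actual code:

    # Hypothetical sketch of an SNS-triggered Lambda that pulls a model file from S3.
    import json
    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        # SNS delivers one or more records; the Message body here is assumed
        # to name an S3 object (invented shape for illustration).
        for record in event["Records"]:
            msg = json.loads(record["Sns"]["Message"])
            bucket, key = msg["bucket"], msg["key"]   # e.g. a NOAA open-data bucket and a GRIB2 path
            obj = s3.get_object(Bucket=bucket, Key=key)
            payload = obj["Body"].read()
            # ... decode, regrid, write processed output back to S3 ...
            print(f"processed {len(payload)} bytes from s3://{bucket}/{key}")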
All of these systems have equivalents in all the major cloud providers. So if the builder of this wanted to move to GCP or Azure, they are not really locked to AWS. This can all be built in another cloud.
Now, could you do it in a day? No. Assuming they are building it with Infrastructure as Code (such as Terraform), they would need to swap the provider and change the resource blocks. But this is akin to refactoring in a codebase. It's work, but it's not terribly difficult. Then they point it at their new cloud and run `terraform apply`.
There is almost no way to entirely remove vendor lock-in. The closest you could come is by designing everything yourself on bare metal servers and renting those from a cloud provider. So instead of using a managed queue system, you run some sort of messaging queue on the server. Then you host files on the server's filesystem, and you run the "lambdas" as applications on the server. But that almost causes more headaches than you save or solve for.
I look at Cloud Providers as similar to cell phone providers. I know people who live in fear of being locked into a contract with Verizon or something. But really, what are you going to do? You will always need a cell phone. The only other real choice is AT&T or maybe Sprint/TMobile. How often are you really going to switch and what are you really gaining by doing so? Energy spent worrying about being "locked in" to a cloud vendor is energy wasted. Yeah you can move from AWS to Azure or GCP. But that's about it. What do you gain by switching? Probably almost nothing. They are all pretty comparable at this point in reliability, features, and price (GCP is the slight laggard here, but not by much). If Google calls your company and offers you a huge discount to switch, you could still do it. Aside from that, there's minimal incentive to do so.
There are a few weird services that AWS has for example that might be considered "lock-in" services. This would be things like AWS Snowball or AWS Groundstation. These don't have comparable systems on other platforms. In the case of Snowball you probably have so much data on AWS that just transferring data would take months (or even years) which could be considered a form of lock-in.
tl;dr - This is a very tame arch diagram. A few lambdas, s3 buckets, and messaging queues, all of which have comparable services on all major clouds. There isn't significant vendor lock-in, this could be rebuilt fairly easily (assuming they used IaC) on any major cloud provider.
> This diagram is actually pretty simple
The diagram looks like an ad
> All it uses are Lambdas (serverless functions), S3 buckets (object storage), and SNS (broadcast/push queues)
Do you actually need all of this or do you use it because Amazon tells you to? I know for instance you cannot use Amazon SES without also using S3 and Lambda
> So if the builder of this wanted to move to GCP or Azure, they are not really locked to AWS. This can all be built in another cloud
You're saying that I cannot move to another cloud provider without my existing code becoming useless?
> Assuming they are building it with Infrastructure as Code (such as Terraform) then they would need to convert the provider and change resource blocks
What about the data pipelines and business logic?
> There is almost no way to entirely remove vendor lock-in
There is: avoiding vendor-specific APIs altogether
> Closest you could come is by designing everything yourself on bare metal servers and renting those from a cloud provider
I don't have to. There are things like Railway, Fly.io, PlanetScale, Supabase, Upstash, Minio, which can work without locking me in
> What do you gain by switching?
Freedom
> There isn't significant vendor lock-in, this could be rebuilt fairly easily (assuming they used IaC) on any major cloud provider
You are contradicting yourself
These types of projects are great for stuff like home automation. I'm using it to improve my predictions for power generation (PV) and consumption (heat pump). Planning to use it to optimize home battery charging in the future.
(Disclaimer: I open-sourced a small Go library for Open-Meteo, but am otherwise not affiliated.)
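For the PV side, a minimal sketch of pulling hourly irradiance and cloud cover from the Open-Meteo forecast API (parameter names are my best recollection of the API, so check the docs):

    # Sketch: hourly shortwave radiation + cloud cover from Open-Meteo for PV forecasting.
    import requests

    LAT, LON = 52.52, 13.41   # example coordinates
    resp = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={
            "latitude": LAT,
            "longitude": LON,
            "hourly": "shortwave_radiation,cloudcover,temperature_2m",
            "forecast_days": 2,
        },
        timeout=10,
    )
    resp.raise_for_status()
    hourly = resp.json()["hourly"]

    for t, sw, cc in zip(hourly["time"], hourly["shortwave_radiation"], hourly["cloudcover"]):
        print(t, f"{sw} W/m2", f"{cc}% cloud")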
In contrast to pirate weather, I am using compressed local files to more easily run API nodes, without getting a huge AWS bill. Compression is especially important for large historical weather datasets like ERA5 or the 10 km version ERA5-Land.
Let me know if you have any questions!
I've got a bunch of moving parts in my system, including realtime (5-minutely) energy pricing. If it looks like it's going to be cloudy tomorrow I put my thumb on the scales to make it more likely that my system will buy power from the grid to top off the battery so I can ride through any price spikes.
I don't have the stats chops to determine whether I'm actually saving any money with this approach, but it sure is a lot of fun.
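The "thumb on the scales" part might look something like this - the thresholds and inputs are entirely made up, just to show the shape of the heuristic:

    # Hypothetical sketch: bias the battery toward grid charging when tomorrow looks cloudy.
    def overnight_charge_target(cloud_cover_pct, current_price, cheap_price_threshold, battery_soc_pct):
        """Return the state-of-charge (%) to aim for before morning."""
        target = 40            # normally leave headroom for tomorrow's solar
        if cloud_cover_pct > 70:
            target = 90        # cloudy tomorrow: ride through price spikes on stored energy
        # only top up from the grid while power is cheap
        if current_price <= cheap_price_threshold and battery_soc_pct < target:
            return target
        return battery_soc_pct  # otherwise hold

    # e.g. overnight_charge_target(85, 0.09, 0.12, 35) -> 90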
1. https://forecast.weather.gov/MapClick.php?lat=37.78&lon=-122...
2. https://forecast.weather.gov/MapClick.php?lat=37.98&lon=-120...
FWIW, they also have a pretty decent API. It's based around zones though, which you'd need to look up. So from that lat and lon, you'd get the zone from:
https://api.weather.gov/points/37.7827,-120.38
Using the zone information, you can get to a forecast:
https://api.weather.gov/gridpoints/STO/72,24/forecast
If you're wanting the current observations, you'd pick a station for that grid such as MOUC1 and go to its observations/latest endpoint:
https://api.weather.gov/stations/MOUC1/observations/latest
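Putting the whole chain together (coordinates and station from the links above; the property names are what I recall from the JSON-LD responses, so double-check against a live response):

    # Walks the api.weather.gov chain: point -> gridpoint forecast -> station observation.
    import requests

    HEADERS = {"User-Agent": "weather-script (you@example.com)"}  # NWS asks for an identifying UA
    LAT, LON = 37.7827, -120.38

    # 1. Resolve the point to a forecast office + grid square
    point = requests.get(f"https://api.weather.gov/points/{LAT},{LON}",
                         headers=HEADERS, timeout=10).json()
    forecast_url = point["properties"]["forecast"]   # e.g. .../gridpoints/STO/72,24/forecast

    # 2. Fetch the forecast periods for that grid square
    forecast = requests.get(forecast_url, headers=HEADERS, timeout=10).json()
    for period in forecast["properties"]["periods"][:4]:
        print(period["name"], period["temperature"], period["shortForecast"])

    # 3. Latest observation from a nearby station
    obs = requests.get("https://api.weather.gov/stations/MOUC1/observations/latest",
                       headers=HEADERS, timeout=10).json()
    print(obs["properties"]["temperature"]["value"], "degC")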
The API is in a JSON-LD format so it's got a lot of links to related topics in the actual JSON payload. Looking at the JSON can make it somewhat easy to feel out what you need. The documentation is here:
Initially I had a few bookmarks saved on my Android home screen, one for each city whose weather forecast I was interested in. But soon there were 4 cities in total, and the text size was too small to read without zooming. So I slurped their JSON API & wrote this:
You need to add a URL for each city whose weather you want to see, in Settings. All that data is saved in your device's local storage. No funny trackers or phoning home.
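For reference, reading one of those MapClick feeds is just a GET plus a couple of lookups - the top-level field names below ("currentobservation", "time", "data") are from memory of that JSON and should be verified against a live response:

    # Sketch of reading a forecast.weather.gov MapClick JSON feed.
    # Field names are from memory; verify against an actual response.
    import requests

    url = ("https://forecast.weather.gov/MapClick.php"
           "?lat=37.7771&lon=-122.4197&unit=0&lg=english&FcstType=json&TextType=1")
    doc = requests.get(url, headers={"User-Agent": "weather-script"}, timeout=10).json()

    print(doc["currentobservation"]["Temp"])   # current temperature (F), assumed key
    for name, text in zip(doc["time"]["startPeriodName"], doc["data"]["text"]):
        print(name, "-", text)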
You can also paste the following json in Settings > Text Area > Click Import JSON:
[
{
"abb": "STK",
"full": "Stockton",
"url": "https://forecast.weather.gov/MapClick.php?lat=37.95&lon=-121.29&unit=0&lg=english&FcstType=json&TextType=1"
},
{
"abb": "SFO",
"full": "San Francisco",
"url": "https://forecast.weather.gov/MapClick.php?lat=37.7771&lon=-122.4197&unit=0&lg=english&FcstType=json&TextType=1"
}
]
https://developer.apple.com/documentation/weatherkitrestapi
I created the start of a Python wrapper for WeatherKit, if anyone is interested in helping with that effort:
Right now there are thousands of weather stations posting regular reports via APRS.
Any chance to see those integrated?
I am in Europe and it was completely off in both its 24-hour forecast and the actual real-time weather. It indicated continuous heavy snowfall whereas in reality the sky was just lightly clouded with no precipitation. Just my 2c.
It seems that they only use NOAA data even though there are vastly superior models for the EU, e.g. ICON-EU and ECMWF.
Just bought you a coffee, everyone who upvoted this should chip in!
Would be cool if Pirate Weather could serve as the foundation for such a thing.
> Exceeded daily email limit.
Happens when I try to sign up
Emails to them resulted in replies asking me to use their FTP instead.
Additionally, they're one of the few Australian government agencies that still don't have a proper HTTPS deployment.