2) Custom silicon. Open source tools & decentralization of fab tech (by countries not wanting to be subject to international trade problems... as well as Moore’s Law slowing) are gonna make this like making a PCB.
3) Electric airplanes. With wings. “Drones” as VTVL electric helicopters have been big for a decade, but there are physical limitations to that approach. If you want to see the future, look at drone delivery in Rwanda by Zipline. But also, I think we’ll see passenger electric flight start small and from underserved regional airports sooner than you think, doing routes and speeds comparable to High Speed Rail (and with comparable efficiency) but much more granular and without massive infrastructure.
4) Small tunnels by The Boring Company and probably competitor companies with similar ideas. When you can build tunnels for one or two orders of magnitude cheaper than today, cheaper than an urban surface lane, then why NOT put it underground? And they’ll be for pedestrians, bikes, and small subways like the London Underground uses. Gets a lot of hate from the Twitter Urbanist crowd, but what TBC has done for tunneling cost (on the order of $10 million per mile) is insanely important for how we think about infrastructure in the US.
5) Reusable launch technology. The world still doesn’t grok this. They will. It (long term) enables access to orbit at prices comparable to really long-distance airfare or air freight.
6) Geoengineering. I’m not happy about it, but it’s insanely cheap so... it’ll probably happen.
If you mean what I think you mean (tiny marks on everything that encode information to help computers figure out what they're looking at), I agree. In particular, I've long been waiting for someone in the self-driving sphere to give up on trying to crack the problem by just imaging the world as it is. In a saner world, countries would already be standardizing machine-readable markers on roads and posts and traffic signs. I'm still hoping someone will wake up and make use of this "cheat code".
Assuming such tunneling is in fact practical, I vote for burying the vehicles and letting the pedestrians have the surface...
https://austingwalters.com/chromatags/
Imagine encoding virtual objects or NPCs into a fiducial without a database... basically a 3D model + actions into a piece of paper you can attach anywhere.
Didn’t the Las Vegas 1.5 mi tunnel cost ~$50MM? I don’t think 1.5 mi of surface road (not considering right-of-way costs) costs that much.
I worked at ULA for a year about five years ago. At that time they were arguing that it wasn't going to be cost effective. Back then though there may have been one or two SpaceX landings.
Since I left I haven't kept up with this debate at all. Do you know if ULA changed their stance after all the successful launches?
There are unsolved physical limitations to that with no solution on the near horizon, AFAIK: the energy density of batteries (energy per kg) is simply too low for airplanes to be efficient.
> from underserved regional airports sooner than you think, doing routes and speeds comparable to High Speed Rail (and with comparable efficiency) but much more granular and without massive infrastructure
I didn't understand that, is there a limitation for using normal jet fuel planes from regional airports?
I know about libresilicon [1], are there any others in this space?
New software development is 99% aimless churn.
We don’t need more new tech. We need better applications of old tech. There is so much software that works perfectly fine already. What’s missing is connecting it to real world problems.
"Everything great was created in the '80s, and we've been rediscovering those things every ten years since."
I'm not firm on "the '80s" - maybe this stuff is older than I think - but I think the principle still holds. If it's a problem today, somebody probably thought about it before, and then others came around and wrapped things differently.
It's not BAD to wrap things differently, but the old stuff had more of the sharp corners sanded off, and sometimes we lose that battle-hardened aspect when we rewrite code.
Except for garbage collection/whatever is happening with memory safety today. That's the good stuff.
However, these systems usually didn't take off because they were "before their time". There were cloud services in the 80s, but PCs got faster and cheaper than internet speeds could keep up with. Client-side apps looked better than cloud apps. Similarly, modern data centers and cloud computing primitives didn't exist, so reliability was more miss than hit.
Now the economics have turned and people need data shared across multiple devices. Cloud services are the de facto method of developing applications.
I care a lot about companies that actually make something new or popularize something that already existed but didn't have widespread appeal.
And now, a period of reinvestment in the bottom layers and signs of a diasporic divergence emerging. Movements that are ideologically different from yesteryear's FOSS, and a tightening of SV's grip on events that increasingly causes sand to pour through, new purposings of old tech and roads previously untaken. It's like Alan Kay put it: The future is the past AND the present.
Like looms.
Robotics is actually generally pretty slow. Regular (serial) robot arms are usually significantly slower than a human arm. Some parallel robots (ie where the motors are mostly stationary and don’t have to be waved around by other motors), like a SCARA or Delta robot, can go about 2-3x the speed of a human, but the difference isn’t massive (60 vs 150 picks per minute?).
But looms are insane. Their task is simpler, but they can do over 2,000 picks per minute (!). The yarn in air jet looms can be moving at over 200 mph. And even mechanical looms like rapier looms or projectile looms are super fast. The mechanisms are also super advanced and hard to wrap your mind around. Centuries of optimization of the first really good instance of industrial automation will do that, I suppose.
It makes me think we haven’t reached a completely flat plateau in mechanical development. Our robots today are actually pretty primitive compared to where they could... where they really should be. It also shows just how badly I think a lot of futurists have underestimated human mechanical capability. Human dexterity and force density is crazy impressive. Humans are actually super strong, fast, AND precise.
And hard automation like looms is also underestimated vs. “robot arms.” Hard automation is so much more effective if you can do it; robot arms alone aren’t that great vs. people.
It's also appropriate to put this comment under another one mentioning old tech, because robots have become steampunk: very dear to the first sci-fi writers, now démodé.
As soon as NLP makes another leap forward, we'll start seeing a comeback.
* Better in this case would be a fundamental design to prevent spoofing, provide S2S encryption maybe E2E encryption, fixing MIME typing issues, fixing Rich Text/HTML display, etc. Basically an actually good faith replacement of e-mail instead of a vendor co-opting.
"The best under-the-radar car? It's a horse and buggy I tells ya!" Every one of these posts on HN has to have a hot take that's contrarian.
I’m also genuinely excited that there is growing momentum away from software churn, because we’re not going to solve the complexity crisis with another framework.
Some form of tablets and smartphones were there years before the iPhone or iPad.
And sometimes the tech we need is "out there" and has been for a while, but just hasn't hit "critical mass" yet.
Take @kroltan's answer. I am also extremely bullish on RDF, Wikidata, and the like. But most of this stuff is pretty old now, especially in "Internet years". Which leads, of course, to the question of where the line is between incremental refinement of "old tech" and actual "new tech" as a discrete thing.
We need to find more/better ways of integrating people with tech.
Tech on its own is 10% of the solution. Integrating humans with technology is underrated.
(I say this as the owner of various enterprise SaaS businesses but I'm sure it applies in all aspects of software)
https://medium.com/@adamagb/nintendo-s-little-known-product-...
In some circles, you might even be accused of being a boomer for using SQL. I think a lot of developers are missing out on just how much runway you can get out of SQL and libraries like SQLite. You would also be missing out on one of the greatest breakthroughs in the history of computer science with regard to our ability to model problem domains and perform inhuman queries against them with millisecond execution times. But hey, maybe machine learning and mongodb are working for your shop.
The final thing a lot of people miss are old ideas. Put your entire application on a single server somewhere, with all of its dependencies living in the same box. Optimize the vertical before you rewrite for horizontal, because 99% of the time you will go bankrupt before you get as big as Netflix, so it won't matter anyway. Plus, you would go bankrupt faster by chasing delusions of web-scale grandeur when you could have had the MVP done 3 years ago with just a simple SQLite database back-end and a T3a.micro. More likely than not, you would have discovered it was a bad idea to start with and could have moved on sooner to the thing you actually should have been focusing on.
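As a minimal sketch of the "just use SQLite on one box" idea (the table, data, and query here are hypothetical, purely for illustration), the Python standard library is all it takes:

```python
import sqlite3

# In-memory here; in production this would be a file living on the same
# box as the application code.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, plan TEXT)")
conn.executemany(
    "INSERT INTO users (name, plan) VALUES (?, ?)",
    [("ada", "pro"), ("bob", "free"), ("cam", "pro")],
)

# Declarative relational querying with millisecond execution: the part
# many shops rebuild badly in application code.
rows = conn.execute(
    "SELECT plan, COUNT(*) FROM users GROUP BY plan ORDER BY plan"
).fetchall()
print(rows)  # [('free', 1), ('pro', 2)]
```

No server process, no network hop, no ops burden: the whole "database tier" is one file next to the app.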
Emacs and org-mode (and many things GNU) have started to make more and more sense to me in this day and age.
I suppose you're writing this on a 1982 Commodore 64...
“Better application” + “old tech” ≡ “new tech”.
Group permission, PAM, SSO, etc. It's like these developers have never been exposed to Active Directory ever in their life...
Group permissions predate AD by decades, PAM by a few years. SSO in today's form is a web phenomenon, so a web-oriented solution makes more sense, and there is a lot of work being done in this direction.
So you get new grads that are re-inventing the wheel.
2. Semantic sysadmin: declare your intent regarding your infrastructure no matter how it is implemented (i.e. with a standard specification, interoperability/migration becomes possible) https://ttm.sh/dVy.md
3. GUI/WebUI CMS for contributing to a shared versioned repository. Sort of what netlify is doing, but using a standard so you can use the client of your choice and we tech folks can hold onto our CLI while our less-techie friends can enjoy a great UI/UX for publishing articles to collective websites.
4. Structured shell for the masses. Powershell isn't the worst, but in my view nushell has a bright future ahead. For the people who don't need portability, it may well entirely replace bash, Python and perl for writing more maintainable and user-friendly shell scripts. https://nushell.sh/
5. A desktop environment toolkit that focuses on empowering people to build more opinionated desktops while mutualizing the burden of maintaining core components. Most desktop environments should have a common base/library (freedesktop?) where features/bugs can be dealt with once and for all, so we don't have to reinvent the wheel every single time. Last week i learnt some DE folks want to fork the whole of GTK because it's becoming too opinionated for their usage, and GNOME is nowadays really bloated and buggy thanks to javascript hell. Can't we have a user-friendly desktop with solid foundations and customizability?
I'd love to get more of your thoughts around how PowerShell might be more useful for the kinds of scenarios you're thinking about. We see a lot of folks writing portable CI/CD build/test/deploy scripts for cross-platform apps (or to support cross-platform development), but we're always looking to lower the barrier of entry to get into PowerShell, as it can be quite jarring to someone who's used Bash their whole life (myself included).
Structured shells have so much potential outside of that, though. I find myself using PowerShell to "explore" REST APIs, and then it's easy to translate that into something scripted and portable. But I'd love to get to a place one day where we could treat arbitrary datasets like that, sort of like a generalized SQL/JSON/whatever REPL.
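That "generalized SQL/JSON/whatever REPL" workflow can be approximated in any scripting REPL today; here is a hedged sketch in Python with a made-up dataset, the point being that you filter and group objects rather than grepping text:

```python
import json

# Hypothetical API response; in practice this would come from
# urllib/requests against a real endpoint.
payload = json.loads("""
[
  {"name": "svc-a", "region": "us-east", "healthy": true},
  {"name": "svc-b", "region": "eu-west", "healthy": false},
  {"name": "svc-c", "region": "us-east", "healthy": true}
]
""")

# The "structured shell" move: operate on records, not lines of text.
unhealthy = [s["name"] for s in payload if not s["healthy"]]

by_region = {}
for s in payload:
    by_region.setdefault(s["region"], []).append(s["name"])

print(unhealthy)          # ['svc-b']
print(sorted(by_region))  # ['eu-west', 'us-east']
```

A structured shell gives you this interaction style at the prompt, without the ceremony of writing a script first.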
Plus, PS enables me to Google regex less :D
Microsoft has always had this problem, but with PowerShell -- which is supposed to be this unified interface to all things Microsoft -- it is glaringly obvious that teams at Microsoft do not talk to each other.
To this day, the ActiveDirectory commands throw exceptions instead of returning Errors. Are you not allowed to talk to them?
The Exchange "Set" commands, if failing to match the provided user name, helpfully overwrite the first 1,000 users instead because... admins don't need weekends, am I right? Who doesn't enjoy a disaster recovery instead of going to the beach?
I'm what you'd categorise as a power user of PS 5.1, having written many PS1 modules and several C# modules for customers to use at scale. I've barely touched PowerShell Core because support for it within Microsoft is more miss than hit.
For example, .NET Core has caused serious issues. PowerShell needs dynamic DLL loading to work, but .NET Core hasn't prioritised that, because web apps don't need it. The runtime introduced EXE-level flags that should have been DLL-level, making certain categories of PowerShell modules impossible to develop. I gave up. I no longer develop for PowerShell at all. It's just too hard.
It's nice that Out-GridView and Show-Command are back, but they launch under the shell window, which makes them hard to find at the best of times and very irritating when the shell is embedded (e.g. in VS Code).
The Azure commandlets are generally a pain to work with, so I've switched to ARM Templates for most things because PowerShell resource provisioning scripts cannot be re-run, unlike scripts based on the "az" command line or templates. Graph is a monstrosity, and most of my customers are still using MSOnline and are firmly tied to PS 5.1 for the foreseeable future.
Heaven help you if you need to manage a full suite of Hybrid Office 365 backoffice applications. The connection time alone is a solid 2 minutes. Commands fail regularly due to network or throttling reasons, and scripts in general aren't retry-able as mentioned above. This is a usability disaster.
Last, but not least: who thought it was a good idea to strip the help content out and force users to jump through hoops to install it? There ought to be a guild of programmers so that whoever decided that could be summarily ejected from it!
I’ve used PowerShell regularly since way back when (it was still called Monad when I first tried it).
I’m extremely comfortable in the Windows environment but even yesterday I found it easiest to shell out to cmd.exe to pipe the output of git fast-export to stop powershell from messing with stdout (line feeds)
I really like the idea of a pipeline that can pass more than text streams but it absolutely has to be zero friction to pipe the output of jq, git (and awk, sed etc for oldies like me) without breaking things.
A couple of more specific points I'd like to add after experience writing non-trivial PS scripts:
- Tooling is still spotty. Last I used the VS Code extension, it was flaky and provided little in the way of formatting, autocomplete or linting. AIUI PowerShell scripts should be easier to statically analyze than an average bash script, so something as rigorous as ShellCheck would be nice to have too.
- Docs around .NET interop still appear to be few and far between. I recall having to do quite a bit of guesswork around type conversions, calling conventions and the like.
It's nice to see the docs have had a major overhaul since I last dug into them though :)
apt search powershell returns no meaningful result on Debian unstable. I think that's a big barrier to entry, at least for me and people who deploy using docker images based on Debian and Ubuntu.
Edit: unless you are also responsible for DSC, then I'll take it back. It's terrible.
You should! It was definitely a compliment.
> I'd love to get more of your thoughts
On a technical level, i would say PowerShell is a breakthrough because it democratized the concept of structured data REPL as a shell. This pattern was well-known to GNU (and other LISP) hackers but not very popular otherwise, so thank you very much for that. Despite that, having telemetry in a shell is a serious problem in my view. That, and other technical criticisms others have mentioned (see previous HN discussions about PowerShell) is why i don't use PowerShell more.
On a more meta level, i'd say the biggest missing feature of the software is self-organization (or democracy if you'd rather call it that). The idea is great but the realization is far from perfect. Like most products pushed by a company, PowerShell is being developed by a team who has their own agenda/way and does not take time/energy to gather community feedback on language design. I believe no single group of humans can figure out the best solutions for everyone else, and that's why community involvement/criticism is important. For this reason, despite being much more modest in the current implementation, i believe NuShell being the child of many minds has more potential to evolve into a more consistent and user-friendly design in the future.
Beyond that, i have a strong political criticism of Microsoft as part of the military-industrial complex, as a long-standing enemy of free software (still no GitHub or Windows XP source code in sight despite all the ongoing openwashing) and of user-controlled hardware (remember when MS tried to push for SecureBoot to not be removable in BIOS settings?), as an unfair commercial actor abusing its monopoly (forced sale of Windows with computers is NOT ok, and is by law illegal in many countries), and more generally as one among many corporations in this capitalist nightmare profiting from the misery of others and contributing its fair share to the destruction of our environment.
This is not a personal criticism (i don't even know you yet! :)) so please don't take it personally. We all make questionable ethical choices at some point in life to make a living (myself included), and i'm no judge of any kind (i'll let you be your own judge if you let me be mine). In my personal reflection about my own life, I found some really good points in this talk by Nabil Hassein called "Computing, Climate Change, and All our Relationships", about the human/political consequences of our trade as global-north technologists. I strongly recommend anyone to watch it: https://nabilhassein.github.io/blog/computing-climate-change...
> how PowerShell might be more useful for the kinds of scenarios you're thinking about
I don't think i've seen any form of doctests in PowerShell. I think that would be a great addition for many people. A test suite in separate files is fine when you're cloning a repo, but scripts are great precisely because they're single files that can be passed around as needed.
> Structured shells have so much potential outside of that, though.
Indeed! If they're portable enough, have some notion of permissions/capabilities and have a good type system they'd make good candidates as scripting languages to embed in other applications because these applications usually expose structured data and some form of DSL, so having a whole shell ecosystem to develop/debug scripts would be amazing.
I sometimes wonder what a modern, lightweight and consistent Excel/PowerShell frankensteinish child would look like. Both tools are excellent for less experienced users and very functional from a language perspective. From a spreadsheet perspective, a structured shell would for example enable better integration with other data sources (at a cost of security/reproducibility but the tradeoff is worthwhile in many cases i think). From a structured shell perspective, having spreadsheet features to lay data around (for later reuse, instead of linear command history) and graph it easily would be priceless.
> I'd love to get to a place one day where we could treat arbitrary datasets like that, sort of like a generalized SQL/JSON/whatever REPL.
Well that's precisely what nushell's "from" command is doing, supporting CSV, JSON, YAML, and many more! https://www.nushell.sh/book/command_reference.html no SQL there yet ;-)
PS: I wish you the best and hope you can find some time to reflect on your role/status in this world. And i hope i don't sound too condescending, because if you'd asked me yesterday what i would tell a microsoft higher-up given the occasion, it would have been full of expletives :o... so here's me trying to be friendly and constructive as much as possible, hoping we can build a better future for the next generation. Long live the Commune (150th birthday this year)!
Admittedly, i have no idea why we even need to do that nowadays, but that seemed to work.
$ curl -v --head https://nushell.sh/
* Trying 162.255.119.254...
* TCP_NODELAY set
* Connection failed
* connect to 162.255.119.254 port 443 failed: Connection refused
$ curl -v --head https://www.nushell.sh/
* Trying 185.199.108.153...
* TCP_NODELAY set
* Connected to www.nushell.sh (185.199.108.153) port 443 (#0)
Most hosts will alias or redirect away the www subdomain, but that's just a convenience. Of course, technically foo.com and www.foo.com can have different DNS entries.

Did you have some specific tool in mind? Because I completely agree that this is a great way of working with content. We have been doing that for a couple of months with our own tool. It uses Git and stores content in a pretty-printed JSON file. Techies can update that directly and push it manually. Content editors can use our tool to edit and commit to Git with a simple Web UI. Would that go in a direction you were thinking of?
If NetlifyCMS was a robust library for abstracting over versioning system and static site generator to build WebUI/TUI/GUI clients with, that would fit what i have in mind. I don't know of any such program yet, please let me know if you find something :)
nullpointer is just a file upload system of which ttm.sh is an instance residing in the tildeverse. I sometimes use it to publish drafts to collect thoughts/feedbacks on ideas i have. I'm also part of the tildeverse. I reside on thunix.net and do sysadmin for fr.tild3.org. I'm often around on #thunix, #selfhosting (etc) on tilde.chat in case you're also around :)
> I am a lifelong sysadmin, and have thought about #2 frequently. I am thinking seriously about making it a research project.
I think a lot of us have been obsessed with this idea for a while, but nobody to my knowledge has done it yet. If you feel like exploring this idea, amazing! It is in my view a complex-yet-solvable problem that many projects have failed to deal with because they've been too focused on narrow use-cases and not on the broader conception of a standardized specification ecosystem for selfhosting. If you feel like exploring this idea collectively (for example by cooperating on a standard for which you would contribute a specific implementation), count me in. I think a lot of brilliant people will be glad to board the ship once it's sailing!
If you'd like to see where this idea has taken me so far, take a look at the joinjabber.org project. The entire infrastructure is described in a single, (hopefully) human-meaningful configuration file, with Ansible roles implementing it: https://codeberg.org/joinjabber/infra
Wish you the best; please keep me updated if you have more thoughts on this topic or would like to actively start such a project.
I wonder if shell stuff would work better in a notebook like environment.
Edit: At least one exists: https://shellnotebook.com/
Really? Can you elaborate a bit on the why? As far as I can tell, GNS has been around as a proposal for years and has gained no traction.
I know for a fact the opposite is true for me. A simple shell syntax with an amazing documentation is all it takes for people to write useful scripts.
I'm confident i can teach basic programming to a total newbie using a structured shell in a few minutes. Explaining quirks of conditionals and loops in usual shells is trickier: should i use "-eq" or "=" or "=="? why am i iterating over a non-matching globset? etc.
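For anyone who hasn't hit it, here is a minimal sketch of the -eq/=/== quirk mentioned above (values made up):

```shell
#!/bin/sh
# -eq is numeric comparison; = is POSIX string comparison ( == is a bashism).
n=10
[ "$n" -eq 10 ] && echo "numeric match"
[ "$n" = "10" ] && echo "string match"

# The trap: the two notions disagree as soon as formatting differs.
[ "010" -eq 10 ] && echo "010 equals 10 as a number"
[ "010" = "10" ] || echo "but 010 does not equal 10 as a string"
```

A structured shell sidesteps this entirely by giving values real types, so there is only one equality operator to teach.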
> why not just learn bash?
I have a love-hate relationship with bash. It's good, but full of inconsistencies, and writing reliable, fail-safe scripts is really hard. I much prefer a language in which i'm less productive but that doesn't take me hours of debugging every time i reach an edge case.
Also, bash has very little embedded tooling, compared to nushell. In many cases, you have to learn more tools (languages) like awk, jq. In nushell, such features are built-in.
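To illustrate that point, here is a typical bash-era hop through awk for anything column-shaped (data made up):

```shell
#!/bin/sh
# In plain shell, structured operations get delegated to awk/cut/jq/...
# Sum the second column of whitespace-separated records.
total=$(printf 'a 1\nb 2\nc 3\n' | awk '{s += $2} END {print s}')
echo "$total"   # 6
```

In nushell the same job stays in one language: the data is a table and summing a column is a built-in pipeline step, so there is no second mini-language to learn.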
> being installed on almost every Linux/Unix system
Well, bash is definitely very portable. But at this game, nothing can beat a standard POSIX /bin/sh. Who knows? It may outlive us all :)
People who a) are trying to escape from the insanity of traditional shells and use something that works with structured data, and b) want something other than PowerShell.
The way we treat allergies today, with Zyrtec and Claritin, is medieval medicine. It doesn't solve the underlying problem; it just tries to cover it up.
Allergy immunotherapy is the future. Most people don't realize that allergies are now a curable disease. In the future, taking Claritin for allergies is going to seem like taking Tylenol for an ear infection. Why would you treat the symptoms when you could just cure the disease?
I started Wyndly (https://www.wyndly.com) to bring immunotherapy for pollen, pets, and dust across the USA. But we'll expand into food allergies soon, too.
Besides the delivery vehicle, what are the differences between allergy shots and these droplets?
“Patients who have done [allergy drops] and finished a course are now free of taking any allergy medications and they don’t have symptoms anymore.” Dr. Sandra Lin, video interview, Sublingual Immunotherapy (SLIT) for Allergy Treatment: Johns Hopkins | Q&A, author of Efficacy and Safety of Subcutaneous and Sublingual Immunotherapy for Allergic Rhinoconjunctivitis and Asthma (2017)
Video: https://www.youtube.com/watch?v=dpWomI4iPLY Paper: https://pubmed.ncbi.nlm.nih.gov/28964530/
Aqueous allergy drops are both safe and effective for environmental allergies (aka allergic rhinitis):
- This has been proven through 30 years of data published by leading allergists in key journals, and confirmed in a 2011 independent Cochrane review.
Learn more: https://pubmed.ncbi.nlm.nih.gov/21154351/

Aqueous allergy drops have a better safety profile than allergy shots:
- Unlike shots, there has never been a documented fatality from allergy drops in 30 years of use.
- The risk of a systemic reaction is thought to be 1 per 100 million doses, or 1 per 526,000 treatment years.
Learn more: https://pubmed.ncbi.nlm.nih.gov/22150126/

Allergy drops are equally as effective as allergy shots:
- They have widespread use in Europe (up to 80% of the immunotherapy market in some countries).
- Comparison studies by leading allergists have shown both allergy shots and allergy drops to be effective, but no clear superiority of one mode over the other.
Learn more: https://pubmed.ncbi.nlm.nih.gov/23557834/ https://pubmed.ncbi.nlm.nih.gov/26853126/
Have any data to back up this substitute?
Also, Zyrtec and Claritin did nothing for me, I’m an Allegra guy
What my doctor told me was: you're going to get increasing doses a few times a week, it will take a lot of time and be hard, and at some point you'll bump into an adverse reaction while you're in the waiting room after the shot.
The medical system was kind of broken with respect to my plight.
After consulting a number of folks, I finally found EPD and went to treatment.
https://en.wikipedia.org/wiki/Enzyme_potentiated_desensitiza...
It was really helping, then they stopped offering it in my area. I was pretty bummed they did away with it, because it helped me without side effects. My symptoms decreased in severity and eventually I felt fine. Apparently it was from the UK and worked well there.
I hope I'm wrong!
Looks like the innovation here is moving the serum from an intradermal shot to a liquid, oral treatment?
(source: I have had immunotherapy)
In SPARQL you write statements in the form
<thing> <relation> <thing>
But the cool part is that any of those three parts can be extracted, so you can ask things like "what are the cities in <country>", or "what country harbors the city <city>", but most importantly, "how does <city> relate to <country>".

For example, if you wanted to find out all the historical monuments in state capitals of a country (using my home country as an example, also pseudocode for your time's sake):
fetch ?monumentName, ?cityName given
?monument "is called" ?monumentName.
?monument "is located within" ?city.
?city "is capital of" ?state.
?city "is called" ?cityName.
?city "is located within" "Brazil".

"Too powerful" doesn't seem like a thing until you realize it undermines DBAs' skill investments, means business-level people have to learn something and solve their own problems instead of managing them, disrupts the analyst-level conversations that exist in Power BI and Excel, seems like an extravagant performance hit with an unclear value prop to devops people, and gives unmanageable godlike powers to the person who operates it. (This unmanageability aspect might be what holds graph products back too.)
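In actual SPARQL syntax, the monument query above would look roughly like this. The ex: predicates are hypothetical stand-ins; a real dataset like Wikidata uses its own identifiers:

```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX ex:   <http://example.org/>   # hypothetical vocabulary

SELECT ?monumentName ?cityName WHERE {
  ?monument rdfs:label ?monumentName .
  ?monument ex:locatedWithin ?city .
  ?city ex:capitalOf ?state .
  ?city rdfs:label ?cityName .
  ?city ex:locatedWithin ex:Brazil .
}
```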
If you don't believe me: the companies who use them also have a rep for having uncanny powers because of their graphs: FB, Twitter, Palantir, Uber, etc.
Using ML to parse and normalize data to fit categories in RDF graphs is singularity-level tech, imo and where that exists today, I'd bet it's mostly secret.
I'm interested in this field and find it fascinating, but we're still in its early dark ages.
When it comes to knowledge representation and reasoning, there's too much emphasis on the representation part and less on the reasoning part, but even the representation part is not a solved problem.
c1
{
// Marked base resource identifiers used for concatenation.
"resources" = [
&people:@"https://springfield.gov/people#"
&mp:@"https://mypredicates.org/"
&mo:@"https://myobjects.org/"
]
// Map-encoded relationships (the map is the subject)
$people:"homer_simpson" = {
/* $mp refers to @"https://mypredicates.org/"
* $mp:"wife" concatenates to @"https://mypredicates.org/wife"
*/
$mp:"wife" = $people:"marge_simpson"
// Multiple relationship objects
$mp:"regrets" = [
$firing
$forgotten_birthday
]
}
"relationship statements" = [
&marge_birthday:($people:"marge_simpson" $mp:"birthday" 1956-10-01)
&forgotten_birthday:($people:"homer_simpson" $mp:"forgot" $marge_birthday)
&firing:($people:"montgomery_burns" $mp:"fired" $people:"homer_simpson")
// Multiple relationship subjects
([$firing $forgotten_birthday] $mp:"contribute" $mo:"marital_strife")
]
}
RDF is gonna be so awesome when it finally hits the mainstream!

I've even written an engine that takes triples and renders web apps.
This is effectively a todo MVC as triples:
var template = {
"predicates": [
"NewTodo leftOf insertButton",
"Todos below insertButton",
"Todos backedBy todos",
"Todos mappedTo todos",
"Todos key .description",
"Todos editable $item.description",
"insertButton on:click insert-new-item",
"insert-new-item 0.pushes {\"description\": \"$item.NewTodo.description\"}",
"insert-new-item 0.pushTo $item.todos",
"NewTodo backedBy NewTodo",
"NewTodo mappedTo editBox",
"NewTodo editable $item.description",
"NewTodo key .description"
],
"widgets": {
"todos": {
"predicates": [
"label hasContent .description"
]
},
"editBox": {
"predicates": [
"NewItemField hasContent .description"
]
}
},
"data": {
"NewTodo": {
"description": "Hello world"
},
"todos": [
{
"description": "todo one"
},
{
"description": "todo two"
},
{
"description": "todo three"
}
]
}
}

See https://elaeis.cloud-angle.com/?p=71 and https://github.com/samsquire/additive-guis
I couldn't agree more. I know a lot of this kind of "semantic web" stuff has some pretty vocal detractors and that adoption seems limited, but I still think there is a ton of "meat on this bone". There's just too much potential awesomeness here for this stuff to not be used. I think this is an example of where incremental refinement is the name of the game. As computers get faster, as we get more data, as algorithms improve, etc. we'll get closer and closer to the tipping point where these technologies really start to reveal their potential.
Another example, demonstrating the querying for the relation part, would be to find Leonardo DaVinci's family members: (again in pseudocode so you don't need to dwell in the syntax)
fetch ?kinName, ?linkName given
    ?link "is called" ?linkName.
    ?kin "is called" ?kinName.
    "Leonardo DaVinci" ?link ?kin.
    ?link "is" "familial".
Line 3 was the "mindblow" moment for me: you can ask how two objects are related without knowing either one! (Though in this example I did know one of them, Leonardo.) In fact, I would love to know if someone else has any other resources too :)
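The same pattern-matching idea can be sketched in a few lines of Python: a toy in-memory triple store, not SPARQL, with made-up triples and link names purely for illustration.

```python
def match(pattern, triple, bindings):
    """Try to unify one (s, p, o) pattern with a triple; '?x' terms are variables."""
    b = dict(bindings)
    for p, t in zip(pattern, triple):
        if p.startswith("?"):          # variable: bind it, or check prior binding
            if p in b and b[p] != t:
                return None
            b[p] = t
        elif p != t:                   # constant: must match exactly
            return None
    return b

def query(patterns, triples, bindings=None):
    """Yield every variable binding that satisfies all patterns (a conjunction)."""
    if bindings is None:
        bindings = {}
    if not patterns:
        yield bindings
        return
    first, rest = patterns[0], patterns[1:]
    for t in triples:
        b = match(first, t, bindings)
        if b is not None:
            yield from query(rest, triples, b)

# Made-up toy data, mimicking the query above.
triples = [
    ("Leonardo DaVinci", "link:father", "Ser Piero"),
    ("Leonardo DaVinci", "link:patron", "Ludovico Sforza"),
    ("link:father", "is", "familial"),
    ("link:father", "is called", "father"),
    ("link:patron", "is called", "patron"),
    ("Ser Piero", "is called", "Ser Piero da Vinci"),
    ("Ludovico Sforza", "is called", "Ludovico il Moro"),
]

results = list(query(
    [("?link", "is called", "?linkName"),
     ("?kin", "is called", "?kinName"),
     ("Leonardo DaVinci", "?link", "?kin"),
     ("?link", "is", "familial")],
    triples,
))
print(results)   # one result: kin "Ser Piero da Vinci" via link "father"
```

Note how the third pattern leaves the relation itself as a variable, which is exactly the "mindblow" part: the patron link is found too, but the last pattern filters it out.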
I'll second the recommendation of Jena (and associated sub-project Fuseki). If you know Java (or any JVM language) you can use the Jena API directly for manipulating the triplestore, and submitting SPARQL queries. If you don't want to do that, Fuseki exposes an HTTP based API that you can interact with from any environment you prefer.
Ask HN: What novel tools are you using to write web sites/apps? - https://news.ycombinator.com/item?id=26693959 - April 2021 (320 comments)
Ask HN: What startup/technology is on your 'to watch' list? - https://news.ycombinator.com/item?id=25540583 - Dec 2020 (248 comments)
But not in very cold environments. I have one, and when there are more than two degrees of frost it struggles.
So for a lot of continental areas they are almost useless since they do not function when you really need them.
For temperate climates and coastal regions they are wonderful.
There’s a massive difference in capability and efficiency and usability of heat pumps. The crappy ones don’t even work below freezing. The good ones can operate even down to -20F efficiently, even air source. And ground source ones don’t have a hard limit at all (although they face a similar wide difference in capability).
Low effort cheap heat pumps are gonna do more harm than good in that they’ll convince people that heat pumps suck.
It’s like the difference between a Tesla and a lead acid golf cart. Both are “electric” “cars”, but there’s vastly different capability.
Granted my shed has quite good insulation, but still.
Worst comes to worst, you could use a ground source heat pump.
This depends on what you mean by "cold" and there are several other factors as well.
For one, they work great below 28 °F. In the range between 28 and 34 °F, there can be issues: water will more readily condense out of the air and form ice on the outdoor unit. And if it is raining and it is 34 °F out, you'll really have some ice.
But below 28 °F, any water in the air is already "frozen" and you aren't going to have as big an issue with ice spontaneously forming on a colder surface.
As long as you can get air flow across the coils on the outdoor unit you are fine, in one sense, "the colder, the better".
But that leads to the next issue: what to do when ice does block air flow? This is where price comes into play. To my knowledge, all of the lower-tier brands (Goodman, Payne, Bryant, maybe even Ruud and Rheem) use a timer-based defrost control. Basically, once the outdoor coil goes below 32 degrees, a switch is tripped and a defrost cycle is forced after 60 minutes, whether there is ice on the unit or not. When temps are below freezing, a defrost cycle can easily take 20 minutes of extra runtime to recover the temperature. Even worse, if snow is drifted up against the outdoor unit, a defrost cycle will cause it to melt into the coils and turn into ice once the defrost cycle is over. So an unnecessary defrost can take a completely ice-free outdoor unit and leave it with one side caked in ice.
To combat this issue, most of the top-line brands (Trane, American Standard, Lennox, Carrier) have "on demand" defrost, so you might very well go 4+ hours of runtime and never see a defrost. However, each brand has its own quirks and can still end up with unnecessary defrost cycles if the air flow through the indoor unit is poor (dirty filter) or if the system refrigeration charge is not 100% perfect.
The other thing that seems to get people is run times. In the south in 100+ degree temps, you can expect your A/C to run for 12 hours in a 24 hour period. Yet for some reason, when a heatpump runs for 2+ hours straight when it is 20 degrees outside, people flip out that it's running too long and going to blow your electric bill up so they flip it over to electric only/emergency mode.
Let's do the math: your heat pump is running for hours on end, drawing 3 kW. You freak out and flip it to emergency mode, which turns on a 20 kW electric heater, and the unit now runs for 30 minutes followed by a 30-minute off cycle. You think it is only using "half" the electricity because it is running half as much. But the reality is, you are now using 10 kWh every hour instead of 3.
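A quick sanity check of that arithmetic, using the comment's example numbers (which are illustrative assumptions, not measurements):

```python
# Energy used per hour: continuously running heat pump vs. emergency strip heat.
HOURS = 1.0

heat_pump_kw = 3.0                   # heat pump draw while running non-stop
hp_energy = heat_pump_kw * HOURS     # 3 kWh per hour

emergency_kw = 20.0                  # resistive "emergency heat" strip
duty_cycle = 0.5                     # 30 min on, 30 min off
em_energy = emergency_kw * duty_cycle * HOURS   # 10 kWh per hour

print(hp_energy, em_energy)          # 3.0 10.0
```

So running "half the time" on emergency heat still uses over three times the energy of the heat pump running continuously.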
Call me crazy, but I don't think it was politics or frozen natural gas that led to the 2021 Texas blizzard power outage, but people with heat pumps who have no idea what they are doing. Even before temps were freezing, several local community groups on Facebook had people spamming "it's going to get below 32 tonight, so for you heat pump users, make sure you put it in emergency heat mode!". And then to make matters worse, local HVAC companies, facing a large influx of people complaining their heat pump had been running for "hours on end", started chiming in saying "it's too cold, go to emergency mode"... Meanwhile, I'm somewhat new to heat pumps myself, but I had forced mine to only use electric heat when in defrost mode. It ran flawlessly; I was very impressed. The vents were blowing 90 degrees all the way until it was 20 degrees out. Once it was 6 degrees out, it was blowing 81 out the vents, but still enough to hold temperature in the house (66). On the worst day, I had a combined runtime of 16 hours with 7 defrost cycles. My bill for the whole month was $140 (the highest ever), while all my neighbors who tried to "save" money by going to emergency mode had bills of $300+.
But heat pumps in old homes just aren't a thing because it's expensive and a lot of work to adapt the house's existing heating. People do understand it, they just don't want to bother with that.
But retrofitting requires a lot of work - replacing radiators with much larger ones, maybe ripping out pipes, and for ground source digging up the garden / street.
This is already happening. Redrow for example are focusing on new heating technology for new homes in preparation for the boiler ban
> I honestly think part of the reason they are not adopted as much is people can't understand them and don't trust them because of their ignorance.
I think there’s genuine reason right now.
For existing houses it only makes sense if you have really good insulation, which rules a lot of people out. That's why they're only being focused on new builds, so they can guarantee the insulation is adequate.
They’re expensive (for existing housing) and the returns aren’t quite there yet. Parts and expertise are also quite limited compared to boilers (the market isn’t really there yet).
It also depends on what climate you live in for how useful they will be.
But like your article says, the installation is too expensive for the expected savings. There is another problem you didn't bring up, which is scalability: geothermal works fine for detached houses, but not so well in cities, which is where most of the energy goes.
District heating is a great solution for denser population areas, here we burn trash and solve two problems, and in theory it can be combined with Geothermal heating
Some people say that there's nothing new in it, but to my mind, they're missing the point : the Berkeley Four took what couldn't be appropriated for profit, and built a statement about what computing is... They revealed the Wizard of Oz to everyone, so that anyone with some computing background can build a processor, freely.
And now this freed wizard is working his magic, and will change the computing landscape irrevocably.
They could already do that. I designed and laid out in silicon a 32 bit processor as part of my undergraduate studies in computer engineering.
Perhaps it will lead to a processor startup, but follow that to its logical conclusion: it takes a huge, profitable company to sustain processor delivery for years. There's a very good reason why only a handful of companies make the top 6 CPU architectures. There are still Synopsys ARC, Renesas, Atmel, and PIC, just to name a few of RISC-V's competitors.
In reality, the Berkeley Four just made a handful of semi companies richer. WDC, NXP, NVIDIA, Microchip, etc. don't have to pay Arm for IP if they use RISC-V. Did that really help anything? Meh.
While I agree there is something not right about cutting into ARM’s profits for the benefit of megacorporations, I think that a royalty-free ISA might genuinely be good for civilization despite that in the same way Linux is. It’s tough though, I’m still not fully sold on that opinion.
There're already designs freely available to use though, either as they are, or to build upon.
And there are also now many other companies designing using the ISA; decentralising the production of chips.
But - over and above the revolutionary economics of it - it's being recognised as a good ISA, and RISC-V cores are already being incorporated into consumer electronics.
Sometime in the last two decades (and again, I'm probably super late to the party on this) it's become extremely affordable to dip one's toe into electronic hardware and embedded software. And not just at the "breadboarding something with an Arduino" level, but at the level of building small production runs of a product that people would actually pay money for.
In a way it reminds me of the mid-2000s era of web technology, where over the course of a few years you went from "putting expensive servers in a data center" to "filling out a web form" in order to host an app in a reasonably high-availability environment.
Or another way of looking at it, a lot of things that maybe you previously had to fund-raise for are now things you can bootstrap, and many things are cheap even by hobby standards.
That means for a lot of projects (for technical folks) you don't need to convince anyone else that your idea has merit, you can just build it and find out.
It could then use IPFS to host "Public Facing" posts. People could pin - or pay for pinning - their posts.
IPFS is what I hope will lead to further democratization of the internet.
Filecoin is a separate thing (mostly), and can (kind of) be thought of as pinning-as-a-service. It's built on a private IPFS network, not the main public one most people use or are directed to. So it's using IPFS, but it is not IPFS.
Briefly it's a genuine, and scientifically uncontroversial, form of 'cold' fusion enabled by muons — a more massive relative of the electron that was in the news recently thanks to the potentially interesting results coming out of the Muon g-2 experiment at Fermilab [2].
Like conventional 'hot' plasma fusion, in all experiments to date the energy input needed to sustain the process has exceeded the output, but it may be possible to use it to generate power. Unlike conventional fusion though, it receives relatively little attention, and there is no well-funded international effort to tackle the associated technical challenges. As with conventional fusion the technical challenges seem formidable, but it could be an interesting technology if a way could be found to make it work.
Listeners of Lex Fridman's podcast may recall that it was briefly mentioned in the episode he made last year with his father, Alexander Fridman, who is a plasma physicist [3]. As someone who has been interested in the idea for years and barely hears any mention of it, I was pleasantly surprised it came up.
It was also covered on the MinutePhysics YouTube channel in 2018 [4].
[1] https://en.wikipedia.org/wiki/Muon_catalyzed_fusion
[2] https://www.youtube.com/watch?v=O4Ko7NW2yQo
[3] https://www.youtube.com/watch?v=hNCz-8QIWuI
[4] https://www.youtube.com/watch?v=aDfB3gnxRhc
Bonus fact: Muon-Catalyzed fusion was first demonstrated by the Nobel laureate Luis Alvarez, who, with his geologist son Walter Alvarez, later proposed the 'Alvarez hypothesis' for the extinction of the dinosaurs by asteroid impact.
-A Software Engineer
https://www.arpa-e.energy.gov/technologies/projects/conditio...
Are there recent developments in this field that change that?
https://techtransfer.universityofcalifornia.edu/NCD/24852.ht...
Unless we figure out some awesome hardware acceleration for it, it's not practical except for a few niche applications.
It also has the problem that you can use computation results to derive the data, if you have enough control over the computation (e.g. a reporting application that allows aggregate reports).
1. Zero-knowledge proofs
2. Shielded ledgers
3. Democratized, energy-efficient mining
4. Inflationary control
5. Wallet recovery
No one has all of these yet, but ZKP is a big part of it.
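For a feel of what a zero-knowledge proof does, here is a toy Schnorr proof of knowledge of a discrete log in Python. The parameters are deliberately tiny and insecure; this is purely an illustration of the protocol shape, not any particular coin's construction.

```python
import random

# Toy Schnorr ZKP: prove knowledge of x with y = g^x (mod p),
# without revealing x. INSECURE demo parameters.
p = 467            # prime modulus (p = 2q + 1, a safe prime)
q = 233            # prime order of the subgroup generated by g
g = 4              # generator of the order-q subgroup

x = 157            # prover's secret
y = pow(g, x, p)   # public key

def prove(challenge_fn):
    r = random.randrange(q)     # prover's random nonce
    t = pow(g, r, p)            # commitment
    c = challenge_fn(t)         # verifier's challenge
    s = (r + c * x) % q         # response (masks x with r)
    return t, c, s

def verify(t, c, s):
    # Accept iff g^s == t * y^c (mod p). The transcript leaks nothing
    # about x, because s is uniformly masked by the nonce r.
    return pow(g, s, p) == (t * pow(y, c, p)) % p

t, c, s = prove(lambda t: random.randrange(q))
print(verify(t, c, s))   # True
```

The check works because g^s = g^(r + c*x) = g^r * (g^x)^c = t * y^c, and only someone who knows x can compute a valid s for an unpredictable challenge.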
If we could accurately simulate and observe what happens in complex chemistry, it would completely change biology, medicine, and materials science.
That's probably far less true than you imagine. See Derek Lowe's take on it: https://blogs.sciencemag.org/pipeline/archives/2021/03/19/ai...
The rate-limiting steps in drug discovery are figuring out a) what you need to muck up to improve health, or b) how to muck it up without mucking up other things badly enough to kill you. Computational chemistry has generally focused more on solving c) how to muck up this target more effectively, or d) how to make the mucker-upper in the first place, which, while not useless, is not going to be a revolutionary change by any stretch of the imagination.
There is the rub. Biological simulations have been written for 40 years now. It's an extremely difficult problem considering how many latent variables are at play, and people have been working on it for a very long time now.
ML techniques will be used to cut that latent variable space in both quantum chemistry and molecular mechanics based methods.
In the early 2010s, I was an undergraduate electrical engineering student with type 1 diabetes, playing around with such models by reprogramming stuff that had been presented in peer-reviewed journal articles. I eventually programmed a closed-loop control system (also known as an "Artificial Pancreas System") as a spring break project to inform my insulin dosages. Mostly, it was a soul-searching project, as engineering school was physically grueling for me given my serious health problems. I had found a paper about sliding mode control with respect to type 1 diabetes that looked solvable to me, but I did not know if it was actually solvable. I decided to see what I could do with it, and I was successful, in 2011, barely old enough to legally drink!
Anyways, I can assure you that while research on control systems is drying up, including for physiological systems, the excitement is just about to begin for what you mention, starting in about 5 years.
If anyone is working on this, and is looking to hire a computational organic chemist-turned-ai engineer, let me know!
There are Python and MATLAB bindings; in fact, MATLAB 2021a now uses SuiteSparse:GraphBLAS for built-in sparse matrix multiplication (A*B).
Oh wait, that's under-the-radar technology.
Stripe + TaxJar was cheaper and easier to implement and maintain.
It's being used in new tokamak fusion reactor designs, like SPARC.
https://en.wikipedia.org/wiki/Rare-earth_barium_copper_oxide
More broadly, decentralizing insurance in this way would be very cool too... There's little difference, in my mind, between a prediction market predicting weather changes or elections, and insurance contracts around risk.
... and what's even cooler is: can we build bots and models to actually get an edge on these predictions? Imagine applying HFT strategies from stocks to predicting real-world events... Now it sounds like we can actually get good at forecasting difficult-to-predict human events, rather than just stock prices.
If you’re in the US there is a regulated prediction market set to launch soon.
Do you have more info about the regulated prediction market? I'd love to learn more.
There are definitely a few players in this space and I'm excited to see where it goes.
- jamstack.wtf
- federation-based networks
- CRDTs: https://josephg.com/blog/crdts-are-the-future/
- data-oriented programming paradigm (https://rugpullindex.com/ shameless plug)
- web components: https://docs.ficusjs.org/index.html
e.g. lemmy.ml
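On the CRDT point, the simplest example is a grow-only counter; here is a minimal Python sketch (illustrative only, not any particular library's API):

```python
# G-Counter CRDT: each replica increments only its own slot; merge takes
# the element-wise max. Merge is commutative, associative, and idempotent,
# so replicas converge no matter the order or repetition of syncs.

class GCounter:
    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.counts = {}             # replica_id -> count

    def increment(self, n=1):
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + n

    def merge(self, other):
        for rid, n in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), n)

    def value(self):
        return sum(self.counts.values())

a, b = GCounter("a"), GCounter("b")
a.increment(3)     # replica a counts 3 events offline
b.increment(2)     # replica b counts 2 events offline
a.merge(b)
b.merge(a)
print(a.value(), b.value())   # 5 5
```

That convergence-without-coordination property is what the linked post argues makes CRDTs the future of collaborative software.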
It reminds me of a Star Trek tricorder. Imagine having a camera with which you can easily ID greenhouse gases, quantify water/fat content in food, identify plant diseases, verify drug components, identify tumours, and measure blood oxygenation. On the machine vision side of things: it could probably outperform any conventional imaging + DNN combination, and you'd probably get pixel-wise segmentation for free while you're there.
There's been a lot of academic progress going on - it shouldn't be long until hyperspectral imaging makes its way into our lives.
https://news.ycombinator.com/item?id=20985429
https://news.ycombinator.com/item?id=20394166
I think it's inevitable that darklang's vision will be achieved eventually, at least in part, whether by darklang or by other projects. We are already at the stage where you can define your infrastructure in code, and execute functions on managed "serverless" runtimes. It's not too much further to the point that cloud providers will build tightly integrated developer experiences that allow a developer to "just code" while handling all of the complexity that comes after. Within some large software companies, there is something close to this experience, but it hasn't yet been wrapped up and sold to the public.
Wireless VR really is a necessity for it to become more than just a tiny niche.
Not yet mainstream, but it’s actually a joy to use and I think we’ll have significant marginal improvements over time which will keep making it more and more worthwhile.
I think the real thing will be commercial applications of VR, where companies use it because it’s the best way to get certain kinds of work done. And NOT desk work, either. We’re maybe a decade or two from that being mainstream, but it’s going to be a significant improvement.
No more ceramic implants, no more root canals. Grow new shiny and healthy teeth.
I haven't been able to really get into FPGAs but I'm optimistic about them. They're pretty clunky right now but I'm hoping they'll just get easier and more accessible.
If we want computation that we're able to verify as being secure, FPGAs are the closest I see us getting to it. There's also applications in terms of parallel computation that they can do over more traditional approaches.
This might go by the way of Transmeta or always remain niche but it seems like they have a lot of potential.
* Open Source Hardware, specifically relating to robotics
Electronics production is becoming cheaper and easier. Open source hardware has the potential to become as ubiquitous as open source software.
Electronics hardware is still way harder than it needs to be, so the progress is slow, but if we get within range of having an iteration cycle in electronics that's as fast as software development, we'll see spectacular innovation. Robotics especially, as that's a kind of straight forward physical manifestation of electronics that has a potentially large market.
There's a $5k fiber laser that can ablate copper. This could potentially fuel the next round of cottage industry board fab houses (in the US and other non-Chinese countries) and enable rapid turn around time. I wish I could justify the $5k to play around with it but it's just outside of my price range.
* Solar
I'm not really sure if this is 'under-the-radar' but for the first time, solar has become cheaper than coal. This means besides giving a moral incentive for people to use solar, there's now an economic one, which means the transition will most likely be broad.
Coupled with battery technology advances, this could have drastic impacts on the ubiquity and cost of power. I wonder if we'll see a kind of "energy internet" where people will create their own distributed electrified infrastructure.
May I suggest you take a look at microROS[3]?
I am also super excited about OSHWA certified open hardware [4].
[1] https://certification.oshwa.org/ [2] https://www.openrobotics.org/ [3] https://micro.ros.org/ [4] https://certification.oshwa.org/
I'm also excited about the OSHWA certification. I've found a bunch of great projects through it.
I've been passively watching ROS but it's always seemed a bit heavy weight for a lot of the things I'd want to do or for what's available cheaply right now. I'm sure this will get easier as full Linux systems will become cheaper and more ubiquitous for embedded applications.
I haven't seen microROS, though, so thanks for the link, I'll check it out.
For a great recent example that gets at some of this, see "Does Your Dermatology Classifier Know What It Doesn't Know? Detecting the Long-Tail of Unseen Conditions" - https://arxiv.org/abs/2104.03829
I'm not affiliated with this work but I am building a company in this area (because I'm excited). Company is in my profile.
https://techxplore.com/news/2021-04-rice-intel-optimize-ai-c...
2. VR and AR. Ten years from now, when hardware can display 16K per eye in casual, lightweight devices no bigger than regular sunglasses, everyone will be wearing one, making mobile phones obsolete. Every object, animal, or human you look at will be augmented. It will be the greatest new technological impact since the rise of the mobile phone, dramatically changing human life.
2) Reconfigurable computing - The power of the FPGA without the hassle, a homogeneous lattice of computing nodes as small as single bits allows for fault tolerance, almost unlimited scaling, and perfect security. It offers the power of custom silicon without the grief.
3) Magnetic core logic, initially realized when transistors still weren't reliable enough to build computers out of, may be making a comeback for extreme environments, such as that on Venus.
4) Reversible compilation - being able to work from source --> abstract syntax tree --> source in any language (with comments intact) will be a quite powerful way to refactor legacy codebases in relative safety.
5) Rich source / Literate Programming - embedding content in the program instead of having a ton of "resources" helps reduce the cognitive load of programming.
while (true) {
    Thread.sleep(1, MONTH);
    sendReminderEmail(user);
}
...which would normally require one to manually keep track of state in queues and key-value stores and idempotent updates; with Temporal, the developer can just focus on the simple business rules. The Temporal runtime takes care of compiling that down into proper state-machine logic.

Most operators haven't figured out their business model for NB-IoT quite yet (at least in Europe) - they're still dabbling. Some seem likely to try to pair it with enterprise "private APN" type solutions. Under such a setup, you can actually get quite an interesting system in place - the operator locks the SIM to a custom APN, and that APN only allows comms to a managed, operator-provided IoT backend.
Then the operator's enterprise services team turns that into dashboards and other things the customer can use and access. In a sense, they're using "extra slack capacity" on their FDD cellular networks (as an NB-IoT carrier is only 200 kHz wide and can sit inside or adjacent to a regular 4G or 5G carrier), and delivering higher margin managed enterprise services.
Some other comments point out the potential to use LoRa - indeed, although if you can use LoRa, you probably aren't the target market for NB-IoT. If you want to deploy 50 million smart meters, a nationwide existing network and managed service from the operator starts to get appealing, as does them handling security, isolating the devices onto a private APN, and helping you update devices in the field.
If you are using LoRa, you need to handle this and deploy the infrastructure. To date though, I've seen lots of "unmanaged" NB-IoT testing taking place, but not a whole lot of the "full service managed offering".
Otherwise I would agree entirely with your point about connecting modern IoT devices to the internet, but in this case I think it will end up for enterprise type deployments where they're restricting that for you.
NB-IoT is justified if I know that the data volume of my "solution" might increase due to feature/scope creep (and replacing the battery/sensor isn't going to become an annoyance in 2-3 years at end of life).
For most use-cases LoRaWAN makes more sense, but it doesn't have the marketing budget that is available to T-Mobile, Vodafone and Co.
When I did my postgrad research project, back in 2016, I was using LoRaWAN and thought it was so obviously going to be huge in e.g. AgriTech. Surprised not that much has happened with it tbh.
It’s queryable like a database but it doesn’t store your data - it proxies all your other databases and data lakes and stuff, and lets you join between them.
Trino is a great example.
Aren't stuff like data lakes and warehouses supposed to address the need for a centralized datastore?
Outside of perhaps an easy-to-apply interface, what benefit would a data hub provide over just streaming duplicates from all of your databases into a single data lake like Snowflake?
We used to have to copy and shuffle data into centralized systems so we can query and join it.
Data hubs do away with all that. Stop needing to think about storage. Instead, it’s just about access.
There have always been fringe tools, eg I once did some cool stuff with MySQL spider engine. But modern systems like Trino (formerly called Presto) make it easy. And, I predict, will hit the mainstream soon.
(not affiliated in any way). https://trino.io/
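The idea can be shown in miniature with plain Python: join a SQL database against a separate "data lake" at read time, without copying either into a central store first. These are toy stand-ins, not Trino's actual mechanics.

```python
import sqlite3

# System 1: an operational SQL database (stand-in for e.g. Postgres).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER, name TEXT)")
db.executemany("INSERT INTO users VALUES (?, ?)",
               [(1, "ada"), (2, "grace")])

# System 2: a "data lake" of event records (stand-in for parsed JSON files).
events = [
    {"user_id": 1, "action": "login"},
    {"user_id": 2, "action": "login"},
    {"user_id": 1, "action": "purchase"},
]

# The "hub" joins the two systems at query time; nothing is bulk-copied
# into a central warehouse beforehand.
names = {uid: name for uid, name in db.execute("SELECT id, name FROM users")}
joined = [(names[e["user_id"]], e["action"]) for e in events]
print(joined)   # [('ada', 'login'), ('grace', 'login'), ('ada', 'purchase')]
```

A real data hub pushes the per-source parts of the query down into each backend and only joins the results, but the "access, not storage" shape is the same.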
https://materialize.com/lateral-joins-and-demand-driven-quer...
Started learning Clojure / ClojureScript and keeping an eye on ML languages like ReScript and ReasonML.
I hope that soon I'll never have to write JS/TS code again.
I talk about this more here: https://news.ycombinator.com/item?id=26084187
I think traditional tokamaks are 5-10 years from positive power due to better superconductor tech. There is finally private investment in the space and it has been growing at an absolutely crazy pace.
I think in about 5 years there is going to be a fusion power gold rush.
https://phys.org/news/2021-04-hydrogen-fuel-machine-ultimate...
https://newatlas.com/energy/osu-turro-solar-spectrum-hydroge...
https://uh.edu/news-events/stories/2017/April/05152017Ren-Wa...
And there is currently a very large PR effort by fossil fuel companies to promote hydrogen. I'd suggest extreme skepticism about any "news" promoting it at present. Always ask where the hydrogen is actually coming from in the present, not 30 years down the road.
Example company creating anti-COVID solutions with it: https://www.zengraphene.com/
Both technologies totally redefine what we mean by "sequencing a genome" and open up broad categories of mutations that are completely invisible to more common forms of genotyping or sequencing.
Also anything that is multipurpose. Rope + tarp = shelter, sunshade, awning, hammock, sail, etc. One gadget cooks and chops etc.
Oh and household robots. Already have vacuum mopping and pool robot. Considering a lawn robot. Clothes folding can’t be far off right?
Siri is underrated in my circles, I hardly see anyone use it. Social anxiety of yelling at your phone?
It's a novel "Proof of" algorithm (Proof of Space and Time) that front loads the resource needs into a Plotting phase, with a very efficient Farming phase after that to process blocks with transactions. Seems like a much more fair, sustainable model for having a secure digital currency.
It also has an interesting, Lisp based programming language on it.
But what excites me is that it's led by Bram Cohen, the dude who invented BitTorrent, one of the best pieces of tech I've used nearly my whole tech life.
That said, yes, with the right amount of investment, someone could try to take over the network. Then again, look how many full nodes are already in the network...
The idea that you can do a sustainable cryptocurrency that remains sustainable no matter how valuable the tokens become in real money terms is self-evidently ridiculous. There's always some limiting resource you'll hit first, and if the cryptotokens are worth real money, that resource will get scarce.
But Chia is a good example of a brilliant person being so seduced by a challenging technical problem that they lose any ability to see foundational problems that people with a tenth of the brampower would be able to spot instantly.
If each "bit" of DNA can be either A, C, G, or T, why call that binary?
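Right: each base carries two bits, so DNA storage is quaternary, not binary. A quick Python sketch of the two-bits-per-base mapping (an illustrative encoding, not any lab's actual scheme):

```python
# DNA storage is base-4: each nucleotide (A, C, G, T) encodes two bits,
# so one byte maps to exactly four bases.
BASES = "ACGT"

def bytes_to_dna(data):
    """Encode bytes as a base sequence, two bits per base, MSB first."""
    out = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            out.append(BASES[(byte >> shift) & 0b11])
    return "".join(out)

def dna_to_bytes(seq):
    """Decode a base sequence back into bytes."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for ch in seq[i:i + 4]:
            byte = (byte << 2) | BASES.index(ch)
        out.append(byte)
    return bytes(out)

s = bytes_to_dna(b"hi")
print(s)                  # CGGACGGC (8 bases for 2 bytes)
print(dna_to_bytes(s))    # b'hi'
```

Real schemes add constraints on top of this (avoiding long homopolymer runs, balancing GC content, error correction), which is part of why raw density claims and practical density differ.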
TLDR: We can now make lenses in any shape we please, not just with parabolas and circles (kinda).
Should have implications for anything that has to do with light: Telescopes, lasers, com-sats, AR/VR, etc.
https://gizmodo.com/a-mexican-physicist-solved-a-2-000-year-...
2) Memristors. We've not found a cheap and stable little component yet like the rest of the 2-lead elements, but it seems to be on the way (says every futurist)
TLDR: We'll be forced to re-do a lot of computer HW as the memristive ones will be (likely) much faster and cheaper on power. Think coin-cell batteries powering very good image recognition systems cheap as a dollar-store watch.
Microsoft and Oculus have hands free controls that actually work. Inside out tracking is progressing quickly. New UX patterns are getting forged.
I'm excited to see what we'll have in a few years' time. In my mind it's far more exciting than something like crypto, but it gets much less press.
The protocol offers blockchain-based smart ticketing which eliminates fraud and prevents scalping. This has the potential to get huge when events start coming back post-covid.
If it is for anything other than very LOW power (microwatts), you're going to be disappointed.
It is essentially a beta emitter hooked to a capacitor via some electronics to handle voltage conversion. The thing is, the beta source is use-it-or-lose-it, and very low power. If you scaled this up to power a Tesla, it would be a nightmare, as it would need to dissipate the full power the car requires, all the time, or it would melt down (aka Fukushima).
For a longer debunk - https://www.youtube.com/watch?v=uzV_uzSTCTM
https://thinkingagriculture.io/the-agriculture-unicorn-hidin...
I think a lot of really cool innovation is going to come out of easily transmissible programmable money.
I'm the only one I know trying to do this. It's changed my life. I'm now applying my ideas in how to choose to lovingly coevolve with partner and the 2.5-year old we conceived. The results from this experiment are getting to the point that people are noticing. There exists a unifying spiritual path through my (mis?)application of category theory in my daily life. I am a noob at hacking the human and I'm saying that after having had major successes within this human body. I also recognize I'm doing way more than many people, so if I'm a noob, so are most-if-not-all people. The Buddha is an example of an elite hacker of the human.
Still waiting for people willing to take the first step, which is cultivating an ideal learning environment within oneself. This means learning to abandon judgments by default.
This will replace Facebook, YouTube, Cloud, Google, Android, everything. (In a millennium or so.)
Now... the race is on to see if we can fill it with normal stuff instead of letting conspiracy theorists and racists flood it.
Ummm... to start with "what everybody else has already said." If I have anything to add it might be (and somebody might have said this already as well, and I just missed it)
Synthetic Biology - this entire field fascinates me, and I expect big things to come in the future when we can customize DNA and grow items we need, that are tailor made to various parameters. This is also the beginning plot-line to many horror novels and movies though, so "everything isn't rainbows and sunshine" as they say.
Nanotech - related to above, but as with synthetic biology, it fascinates me to think what we can do when we have atomic scale self-assembling machines.
AR/VR - maybe not "under the radar" anymore, but I think there's a ton of untapped potential in this space.
Semantic Web - Yes, I'm still a believer in the idea of RDF / SPARQL / etc. I've said enough about this in the past, so I'm not going to drill any deeper here.
AI - maybe more "AGI" than the ML stuff we have today that gets labeled "AI". And saying that is not an attempt to denigrate ML or any of the radical stuff going on today. It's just that for as much as contemporary "AI" can do, I think there's a lot it still can't do, and I like to daydream about the potential of AIs that get closer and closer to (and exceed?) human abilities. See above about horror movie plots though. :-(
Fusion: this has definitely been mentioned already, but add me to the list of people who are hopeful/excited about the prospects.
Time Travel: Actually no. I kinda hope this is impossible. I have a feeling that if unrestricted "Doctor Who" like time travel was possible, causality would collapse and all of reality would just become a big, jumbled mess, incapable of supporting life.
Green mobility concepts like:
- self-driving shared cars for first and last mile
- self-driving trains and buses for urban transportation
- self-driving high-speed trains and self-flying airplanes for longer distances
Yes, most of you have heard of it, but I think it is still very underrated.
Especially interesting are the ML libraries that have come out recently, and OTP 24, whose new JIT compiler gives a ~2x speed improvement to Elixir code, depending on the task.
We've learned to optimize our bodies through nutrition and physical fitness (even if not everyone does it, we have the know how), but our brains are the next frontier.
I've seen lots of snake oil in this space so far. I was going to link to Halo Neuro, but they've been acquired by a tDCS company; from what I understand, the technology isn't ready yet.
We're building a sleep headband that monitors your sleep state, and uses sound to improve your Sleep Performance - https://soundmind.co
Others in this space are Emotiv, Muse, and Dreem.
We're probably going to see a wave of disruptions from technologies like GPT3.
For example, we might see something like this in the long term:
A) Someone will create a model to accurately convert low level source code to higher level source code that does the same thing when compiled down. Think assembly / machine code to high level code or even English descriptions of the underlying semantics.
B) At this point, why not pipe in some DNA/RNA into the model from A) to get high level insights
C) Give it a couple iterations and it might be possible to create a compiler. For example... C to RNA
D) Finally... solve problems by creating sequences from scratch instead of re-using bits from mother nature
If we ever do get to D), I sure hope no country tries to use this in a terrible way...
Blockchains are secure systems because they're isolated systems. But smart contracts aren't very exciting without data from the real world. Oracles are the bridges to supply data from the real world to the blockchain world.
However, a system is only as strong as its weakest link. You'd want the same security guarantees as what blockchain can provide. So the blockchain needs to "trust" oracles to deliver the correct data that's immune to manipulations.
With the rise of smart contracts and full automations, I think oracles will play a huge role in all of this.
The leading project that's working on this is Chainlink.
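One standard defense against a manipulated oracle is to never trust a single reporter: aggregate many independent feeds and take the median, so a minority of bad actors can't move the delivered value. A toy sketch of that idea (real networks like Chainlink layer staking, reputation, and signed reports on top of this):

```python
# Median aggregation: a minority of manipulated feeds can't shift the
# value an oracle delivers on-chain, because the median only moves if
# more than half the reporters collude.
from statistics import median

def aggregate(reports: list[float]) -> float:
    """Combine independent price reports into one oracle answer."""
    if not reports:
        raise ValueError("no reports to aggregate")
    return median(reports)

honest = [100.0, 100.2, 99.9, 100.1]
manipulated = honest + [500.0]   # one attacker reports a wild price
print(aggregate(manipulated))    # still ~100: the attacker is outvoted
```

The design choice here is the key point: the "weakest link" stops being any single data source and becomes the honesty of the majority of reporters.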
Oh no, not another term the "crypto" people have taken from the "cryptography" people and re-interpreted in a completely different way. :(
It's an API for Payroll. The number of use cases is pretty amazing!
I believe 3D printing will change the world in less than twenty years. We're currently in the hobbyist stage–think home computers in the late '70s and early '80s. It took a company to see the potential and package it up for anyone to use. I think there will be a breakthrough home 3D printer that will start whole new industries. You will be able to buy physical products direct from anyone. Anyone can design and sell a vase or a bowl or a boomerang, because manufacturing and distribution are no longer barriers to market. Think of how music and video moved from being possible only in studios, to recording time at a studio, to home recording. Anyways... I'm super excited.
Edit: I should also add something about the 3rd party fabrication / assembly services that are bringing a lot of capabilities to bear, for the average person, who would not have been able to afford them a few years ago. Look at OSH Park, OSH Cut, JLCPCB and the like. Need PCBs? Done. Need laser-cut metal parts? Done. There are similar services for injection molding, etc., etc., etc.
I'm pretty excited about this space. I just bought my first desktop mill (so new I haven't even unboxed it yet) and my first 3D printer. And picked up a cheap Black and Decker convection oven to convert into a reflow oven about the same time. Definitely excited to start exploring the intersection of all of these tools for "building things" without needing a full fledged machine shop, wood shop, yadda, yadda, yadda.
Alternative data, especially for investment decisions.
In the future, geometric algebra will likely be part of everything we do, but for now it remains largely unknown.
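For the curious, the core idea fits in a few lines. A toy sketch of 2D geometric algebra (illustrative only, not a library): a multivector has a scalar, two vector components, and a bivector component, and the geometric product follows from e1·e1 = e2·e2 = 1 and e1e2 = -e2e1:

```python
# Toy 2D geometric algebra: multivectors over the basis (1, e1, e2, e12).
import math
from dataclasses import dataclass

@dataclass
class MV:
    s: float = 0.0  # scalar part
    x: float = 0.0  # e1 coefficient
    y: float = 0.0  # e2 coefficient
    b: float = 0.0  # e12 (bivector) coefficient

    def __mul__(self, o):
        # Geometric product, derived from e1*e1 = e2*e2 = 1, e1*e2 = -e2*e1 = e12.
        return MV(
            s=self.s*o.s + self.x*o.x + self.y*o.y - self.b*o.b,
            x=self.s*o.x + self.x*o.s - self.y*o.b + self.b*o.y,
            y=self.s*o.y + self.x*o.b + self.y*o.s - self.b*o.x,
            b=self.s*o.b + self.x*o.y - self.y*o.x + self.b*o.s,
        )

# Rotations fall out for free: v' = R v R~, with rotor R = exp(-e12 * theta/2).
theta = math.pi / 2
R = MV(s=math.cos(theta/2), b=-math.sin(theta/2))
R_rev = MV(s=math.cos(theta/2), b=math.sin(theta/2))  # the "reverse" of R
v = MV(x=1.0)                 # the vector e1
rotated = R * v * R_rev       # e1 rotated 90 degrees -> e2
```

The appeal is that the same rotor construction works unchanged in 3D and higher, replacing separate machinery like complex numbers, quaternions, and cross products.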
I think a lot of people are only allocated one monitor in most industries. This will change the way they work.
It will also change design.
The fact that there isn't more discussion around the cryptography, networking, etc. suggests to me that many are still unfamiliar with the power of the underlying technology.
I am pretty excited about Ethereum and related ecosystem.