Release: Neutron Highway long range route planner

Hello! o7

A little suggestion: add the number of bodies to either Main Fields or Fields to Show when searching for systems.

As a fleet carrier owner, I often find that my target system is full and my FC can't jump there, so I search for the nearest systems with only one body. That way I'm sure my FC won't end up some 300k ls away from the main star.

It would be cool if we didn't have to click on every result to check the number of bodies in each system.

Maybe this has already been suggested or it's already implemented and I don't know how to do it.

If the latter, please tell me how. Thanks! o7
 
Hi Spansh!

From a fellow web developer, tip of the hat to your work here. I am impressed by the speed at which your Road to Riches tool works, as well as the UI you've put together. You also do a great job of being responsive to your users. I had a few questions:

1. Feature request: Target Credits. For example, I want to save up 100,000,000 credits for some ship outfit. I'd like an optimized circular route that stops once it hits that 100,000,000 target. I understand having another metric to optimize around adds complexity to the algorithm. Traveling salesman! (A rough sketch of what I mean follows these questions.)

2. Have you considered using machine learning to find optimized values for your tool? If not, I might want to have a go at it sometime. I understand that there are a lot of variables involved, so it may not result in anything useful.

3. Is this tool open source? No worries if not; I was just curious about the code. :)
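To make question 1 concrete, here's a rough sketch of the cutoff I have in mind: a greedy pass that keeps taking the most valuable candidate systems until their estimated scan values reach the target. The record layout and all names here are purely illustrative, not anything from the site:

```python
def pick_until_target(systems, target_credits):
    """Greedy sketch: take the highest-value systems until their estimated
    scan values add up to the target. `systems` is assumed to be a list of
    {"name": ..., "value": ...} records, already filtered by jump range."""
    chosen, total = [], 0
    for system in sorted(systems, key=lambda s: s["value"], reverse=True):
        if total >= target_credits:
            break
        chosen.append(system)
        total += system["value"]
    return chosen, total
```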
 
1. This is something I could consider. The routing is done after choosing the systems to visit. Essentially it finds the highest-profit systems within your chosen radius, picks the number you selected, then finds a route around them.
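For anyone curious, that two-phase shape (select by profit, then order the visits) might look roughly like the Python below. The record layout and the nearest-neighbour ordering are my assumptions for illustration, not the site's actual code:

```python
import math

def plan_route(start, systems, radius, count):
    """Sketch of the two-phase approach: filter and rank by profit, then
    order the chosen systems with a greedy nearest-neighbour pass (a cheap
    stand-in for a real travelling-salesman solver)."""
    # Phase 1: keep the most profitable systems within the chosen radius.
    nearby = [s for s in systems if math.dist(s["pos"], start) <= radius]
    chosen = sorted(nearby, key=lambda s: s["value"], reverse=True)[:count]

    # Phase 2: greedy nearest-neighbour ordering.
    route, current = [], start
    while chosen:
        nxt = min(chosen, key=lambda s: math.dist(s["pos"], current))
        chosen.remove(nxt)
        route.append(nxt)
        current = nxt["pos"]
    return route
```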

2. I did look at using machine learning to enhance some of the newer routing algorithms (essentially using it to power the heuristic function for A*). The problem, however, is training it: it takes a long time to generate truly optimal routes to train against. It's something I may look into in the future, but it's not a high priority.
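To make that concrete: in A*, the heuristic is just a pluggable function estimating the remaining cost to the goal, so a trained model could be swapped in wherever a hand-written estimate sits today. A generic sketch (the graph representation and cost model here are my own assumptions, not the site's router):

```python
import heapq
import itertools

def a_star(start, goal, neighbours, heuristic):
    """Generic A*. `neighbours(node)` yields (next_node, edge_cost) pairs;
    `heuristic(node, goal)` estimates the remaining cost -- this estimate
    is the piece a trained model could replace."""
    counter = itertools.count()  # tie-breaker so the heap never compares nodes
    open_set = [(heuristic(start, goal), next(counter), 0.0, start, [start])]
    best = {start: 0.0}
    while open_set:
        _, _, cost, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        for nxt, edge_cost in neighbours(node):
            new_cost = cost + edge_cost
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                priority = new_cost + heuristic(nxt, goal)
                heapq.heappush(open_set, (priority, next(counter), new_cost, nxt, path + [nxt]))
    return None  # no route found
```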

3. I have open sourced a couple of parts of some of the algorithms at https://github.com/spansh/a-star-router . That code is now a bit out of date, as I've improved things quite a bit since then, and it isn't particularly clean either; I also use a different spatial index to look things up now. I am seriously considering open sourcing different parts of the site based upon tier goals on Patreon (the first being the front end, the second being parts of the back end code and the third being the entire codebase). I'm just not sure where I'd want the tiers to be, since the site still doesn't quite pay for its own hosting currently. That said, I'm happy to discuss parts of the algorithms and explain how they work and integrate with the various technologies I use.
 
[ENHANCEMENT REQUEST]

Road to Riches tags or excludes systems within 20Ly of the source system so commanders can sell data there.

I created a new Commander recently, and I've been leveling up engineers with alternative methods to reduce resource usage. Several engineers take cartographic data as the alternative, so it would be helpful to have an option to exclude systems that are too close to the source system for their data to be sold there. This could take several forms:
  1. A checkbox for "Sell data at source system"
  2. A text box for the "Sell At" system with the 20Ly exclusion zone centered there
  3. A column in the target system list for how far it is from the source system
  4. An additional slider for an exclusion radius with a "notch" for 20Ly
The checkbox is probably the most straightforward and self-explanatory option, with the "Sell At" text box being the most flexible.
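Whatever the UI ends up being, the exclusion itself is just a distance check against the chosen "Sell At" system. A minimal sketch (the coordinate layout and field names are my assumptions):

```python
import math

SELL_EXCLUSION_LY = 20.0  # data gathered within this radius can't be sold at the source

def sellable_at(candidate_pos, sell_at_pos, radius=SELL_EXCLUSION_LY):
    """True if a candidate system is far enough from the "Sell At" system
    that its cartographic data could be sold there."""
    return math.dist(candidate_pos, sell_at_pos) > radius

# Hypothetical usage: filter Road to Riches candidates before routing.
candidates = [{"name": "A", "pos": (10.0, 2.0, -5.0)},
              {"name": "B", "pos": (3.0, 1.0, 4.0)}]
sellable = [s for s in candidates if sellable_at(s["pos"], (0.0, 0.0, 0.0))]
```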

--

Great tools, keep up the good work!
 
Machine learning is indeed expensive to do. You either have to tie up your own machine with an appropriate (and expensive) GPU, or pay for computing power in the cloud. AWS has some great machine learning instances that aren't too expensive, especially if you use spot pricing.

I understand why you would not open source your work and I appreciate that you have shared some aspects of it.

Thanks for taking time to respond to my questions!
 
Heya @Spansh, love the data dumps and going through them. The information is fun to do little projects with!

However, with the release of Odyssey, your galaxy-wide data dump has grown ridiculously huge, probably because there are nearly 90 settlements in some populated systems, each with its own unique station-sized record.

Is there any way we (or, well, I) could get a new type of dump, labeled "exploration"?

Unique benefits of this "exploration" dump:
  • No station data
  • No carrier data
  • No commodity, combat, faction, etc. data
  • Only celestial object / biological / geological / exploration-based data

Honestly, it's just the reverse of the "populated" dump :D

Thanks, love all the work you do!

Arboo
 
I think you're overestimating how much size stations add to the file. The stations dump is only 500MB zipped (the same as the populated one), so I strongly suspect you'd save a few hundred MB at best on the filesize. You'd be better off simply processing the existing file line by line and dropping what you don't need (I process that file myself every day using a streaming JSON parser; it takes around 3 hours, including preparation work like downloads and data patching, from 9:00 to 11:51).
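Line-by-line processing works if the dump keeps one system per line inside the outer JSON array (an assumption about the layout on my part), so you never need the whole file in memory. A rough Python sketch that strips station data while streaming:

```python
import gzip
import json

# Rough sketch: assumes one system object per line inside the outer JSON
# array, and a top-level "stations" key holding station/settlement records.
with gzip.open("galaxy.json.gz", "rt", encoding="utf-8") as src, \
        open("exploration.jsonl", "w", encoding="utf-8") as dst:
    for line in src:
        line = line.strip().rstrip(",")
        if line in ("[", "]", ""):
            continue  # skip the array brackets and blank lines
        system = json.loads(line)
        system.pop("stations", None)  # drop station (and carrier) records
        dst.write(json.dumps(system) + "\n")  # keep bodies and their signals
```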

It wouldn't take much for me to add the dump (since I generate all those files in parallel), but I think you'd still be looking at a 42GB file compressed.
 
Doing exobiology, I would love to have a route calculator that targets atmospheric planets to land on. Thanks for having a topic to post in!
 
Hey! Love the toolset you've written. Just a small feature request: fuel-scoopable stars. It's handy to be able to route via stars I can fuel scoop along the way.
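For reference, scoopability comes down to the star's spectral class (the "KGB FOAM" classes: K, G, B, F, O, A, M), so the filter itself is tiny. A sketch with a hypothetical field name:

```python
SCOOPABLE_CLASSES = {"K", "G", "B", "F", "O", "A", "M"}  # the "KGB FOAM" mnemonic

def is_scoopable(star: dict) -> bool:
    """True if a star can be fuel scooped. `spectral_class` is a
    hypothetical field name, e.g. "G" for a G-class main-sequence star."""
    return star.get("spectral_class", "").upper() in SCOOPABLE_CLASSES
```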
 