
Author Topic: a wacky idea - network generalized distributed computing  (Read 2341 times)


Offline ptek (Topic starter)

  • Sr. Member
  • ****
  • Join Date: Jul 2002
  • Posts: 328
a wacky idea - network generalized distributed computing
« on: December 24, 2006, 07:49:49 PM »
A few moments ago I had this (maybe) wacky idea:

Unlike projects such as SETI@home and other distributed computing systems that make use of computers across networks (the Internet included, of course) to create a gigantic computing system for one specific task, why not create a client/server system where any CPU-intensive program (like an MPEG-2 encoder or DVD authoring tool) could individually be allowed to share processing resources across the network?

The second option would be more demanding to realize: changing an existing OS (AROS, for example) or a virtual machine like QEMU or UAE (considering UAE a virtual machine) to make use of distributed computing.

The third option could be in the form of a library (or DLL on PC winblows) that could be called to send processing tasks out to the network and wait for the results.

This is not necessarily Amiga specific... it's just a thought about distributed computing. But if applied to the PC or Amiga, it could make converting a DivX to DVD a snap, for example.

Does this already exist the way I just described?
Could it be done? I think so, even if it would demand intelligent management of tasks in order to avoid unnecessary "freezes" (window dragging/moving and other common tasks don't need to be distributed, and shouldn't be).

This should be developed as an open source project, of course, to avoid any suspicion about the privacy of the exchanged data.
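The "library" idea (option three) could be sketched roughly like this. Everything here is hypothetical: the `distribute` call and its parameters are invented for illustration, and remote peers are simulated with local threads where a real version would ship each chunk, plus its "instructions", over the network.

```python
# Hedged sketch of a hypothetical distribute() library call.
# Peers are simulated with local threads; a real implementation
# would send chunks over the network and gather replies.
from concurrent.futures import ThreadPoolExecutor

def distribute(task, data, chunk_size=4, max_workers=2):
    """Apply `task` to every item of `data`, one chunk per (simulated) peer."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # map() preserves chunk order, so results reassemble in sequence
        processed = pool.map(lambda chunk: [task(x) for x in chunk], chunks)
        return [item for chunk in processed for item in chunk]

# e.g. a stand-in for a CPU-heavy per-frame encoding step:
distribute(lambda x: x * x, list(range(10)))
# → [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

The key point of the sketch is that the caller never sees the network at all: it hands over a task and data, and gets results back in order.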
Onions have layers ...
 

Offline ptek (Topic starter)
Re: a wacky idea - network generalized distributed computing
« Reply #1 on: December 25, 2006, 12:31:42 AM »
Quote
Screw you. You can have my CPU cycles when you pry them from my cold, dead fingers.


It seems that you didn't catch my idea. I referred to the existence of a CPU usage manager, so the idea was not to use 100% of someone else's CPU but only slices of it. Otherwise, it would be unfair.

BTW: I find the "screw you" expression an offensive one. Respect others if you want respect for yourself.
Onions have layers ...
 

Offline ptek (Topic starter)
Re: a wacky idea - network generalized distributed computing
« Reply #2 on: December 25, 2006, 12:44:00 AM »
Interesting stuff, the Beowulf ...
Onions have layers ...
 

Offline ptek (Topic starter)
Re: a wacky idea - network generalized distributed computing
« Reply #3 on: December 25, 2006, 01:09:03 AM »
Quote
To decrease the inefficiency, you'd have to send large chunks of data instead of swapping bits back and forth


Not all tasks should be distributed, even if this system were integrated into the OS... That's why there is a need for a CPU manager to decide whether to handle a task locally or not.
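That local-versus-remote decision could be sketched as a simple cost check: ship a task out only when its expected compute time dwarfs the transfer overhead. The function name, thresholds, and bandwidth figure below are all assumptions made up for illustration, not a real protocol.

```python
# Hypothetical "CPU manager" decision rule: distribute a task only when
# the compute cost clearly outweighs the cost of moving the data.
def should_distribute(cpu_seconds_estimate, payload_bytes,
                      net_bytes_per_sec=1_000_000, min_speedup=2.0):
    """Return True if shipping the task out is likely worth the transfer time."""
    transfer_time = payload_bytes / net_bytes_per_sec
    # Interactive work (window dragging etc.) has tiny CPU estimates,
    # so it always fails this check and stays local.
    return cpu_seconds_estimate > min_speedup * transfer_time

# A 60-second encode of a 5 MB chunk is worth shipping over a ~1 MB/s link;
# a 1 ms UI event is not.
```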

I think the approach of selecting only the programs that we want to take advantage of this "farm" would be best, since there would be an unavoidable delay between task requests and their completion by the remote computers. Of course, the data to be processed would have to be sent in sizable quantities, not bit by bit, along with the "instructions" of what to do with it, to the remote computer(s) selected to handle it. There should be correct management of their resources, so only close-to-idle computers would take the work, and never at 100% of their CPU! Everyone who decided to join the "farm" should benefit, like the users of P2P software such as BitTorrent.
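The "only close-to-idle computers take work" rule could look something like this. The peer list format, the 0.75 load ceiling, and the machine names are invented for the example; the point is just that heavily loaded volunteers are never picked, so nobody's CPU gets pinned at 100%.

```python
# Hypothetical peer selection: prefer the least-loaded machines, and skip
# any peer already above a load ceiling so no volunteer runs at 100% CPU.
def pick_peers(peers, max_load=0.75, wanted=2):
    """peers: list of (name, current_load 0.0-1.0); return up to `wanted` idle-ish ones."""
    idle = [name for name, load in sorted(peers, key=lambda p: p[1])
            if load < max_load]
    return idle[:wanted]

peers = [("amiga1", 0.9), ("aros-box", 0.1), ("pc1", 0.4), ("pc2", 0.8)]
pick_peers(peers)
# → ['aros-box', 'pc1']  (the two least-loaded machines under the ceiling)
```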

Besides, I thought of this as working multi-platform. It could be a good use for multi-GHz Intel/AMD machines in their idle time. Classic Amigas with their Motorola CPUs would suffer a bit; maybe AROS running on x86 would be a better scenario.

The difference from SETI@home is that any program/task could be distributed. SETI@home only handles SETI tasks :)

These are just the thoughts of someone (me) without any practical or theoretical knowledge of distributed computing, which is why I called the idea "wacky". But, who knows, maybe some of what I described here could work.
Onions have layers ...