
Re: URLfs ?



Ira Abramov wrote:
On Thu, 31 Jul 1997, Erez Doron wrote:

> here is a mail I sent to the maintainer of ext2fs in the kernel
> (I didn't find anybody better, so I mailed him).
>
> Have a look at it; I'd like to hear your comments.

well, URLfs is a cute idea, but I'd expect it to come from Microsoft
before Linux. I suspect that with all the cute stuff it can do, it's also
a serious security hazard, and I'd never install it on my machine.

A security hazard? Why?

Also, handshaking with remote sites is sometimes slow or problematic; over
half of lynx's code is dedicated to this, I'd suspect (I give lynx as an
example because it's a pure retrieval engine with almost no interface, at
least compared to graphical browsers). That means a chunk of at least 250k
added to the kernel, with no guarantee of stability. It would be pretty
embarrassing to have your kernel freeze because of a PPP line
disconnection or something... nope. High-level protocols such as FTP and
HTTP must not be handled at ring 0.

Well, the kernel has to wait for a CD or floppy to spin up, and it does not freeze then, so why should it freeze in this case?

There is an opportunity to use a daemon or a loadable module for it.
This keeps the kernel small when URLs are not used, and loads the
code only once (nowadays, if you run 5 lynx and 3 ftp processes together,
you get the code 5 times instead of once, as you would if the URL
handling were in the kernel).

Making a urlfs would make every application support URLs (like mc, xfm,
kfm, and any other file manager or utility) without rewriting or even recompiling the application.

( the urlfs idea came to me after I saw the KDE environment, whose
  file manager KFM addresses local files, HTTP, FTP, and even tar.gz
  archives and man pages in the same way: try opening the file man:ls
  and you get it troffed )
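To illustrate the point, here is a minimal sketch of what path layout such a urlfs might use. Everything here is an assumption for illustration: the mount point /url, the scheme/host/path layout, and the function name url_to_path are all made up, since the thread never fixes a naming scheme.

```shell
#!/bin/bash
# url_to_path: map a URL to the path it might have under a hypothetical
# urlfs mounted at /url (the /url/<scheme>/<host>/<path> layout is an
# assumption, not anything specified in the thread).
url_to_path() {
    local url=$1
    local scheme=${url%%://*}   # everything before "://" (e.g. "http")
    local rest=${url#*://}      # host and path after "://"
    printf '/url/%s/%s\n' "$scheme" "$rest"
}

url_to_path http://www.kernel.org/index.html
# -> /url/http/www.kernel.org/index.html
```

With such a mapping in place, any unmodified tool would gain URL support for free: `cat "$(url_to_path http://www.kernel.org/index.html)"` would read a web page through the filesystem, and mc or grep would work the same way.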

 

OTOH, you could have the shell support it; that's not as big a hack as a
kernel module. For all I know you may even be able to create such a macro
for an existing shell (zsh?), or take the sources of another and tell it
to fire up a tiny external requester (ftpget, webcopy, and others already
exist) each time it finds something that looks like a URL while
parsing...
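The shell-side alternative can be sketched without patching any shell sources: a wrapper function that scans its arguments, fetches anything URL-shaped to a temp file with an external requester, and hands the local copy to the command. The wrapper name "u" is made up here, and wget stands in for the ftpget/webcopy tools mentioned above.

```shell
#!/bin/bash
# u CMD ARGS...: run CMD, replacing any URL-looking argument with a
# local temp-file copy fetched by an external requester.
u() {
    local cmd=$1; shift
    local -a args
    local a tmp
    for a in "$@"; do
        case $a in
            http://*|ftp://*)
                # URL argument: fetch it to a temp file first
                tmp=$(mktemp) || return 1
                wget -q -O "$tmp" "$a" || { rm -f "$tmp"; return 1; }
                args+=("$tmp")
                ;;
            *)
                # ordinary argument: pass through unchanged
                args+=("$a")
                ;;
        esac
    done
    "$cmd" "${args[@]}"
}

# usage: u less http://www.kernel.org/   # page through a web page
```

This gets the per-application transparency without touching ring 0, at the cost of working only for tools invoked through the wrapper.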

Note: if anyone has a programming frenzy burning in his bones and nothing
to do right now, I have a REALLY cool idea that I have no idea how to
implement... something many Linuxers will be thankful for for years to
come :-)
 

--

Regards
Erez.
          ___                                              ___
          L_|_                                            _|_J
         ( -O>                                            <O- )
      ___//\J +------------------------------------------+ L/\\___
     //-,\    | Erez Doron,                              |    /,-\\
    || / \\___L    U.S. Robotics Technologies, Israel    J___// \ ||
  _ ''/\/ '---J Email:                                   L---' \/\'' _
 / \ //\\.    |    erez@scorpio.com                      |    .//\\ / \
|_/\'/  ||    +------------------------------------------+    ||  \'/\_|
    '   ||_                                                  _||   '
        |__)                                                (__|
 

