Linking two computers
cleanbluesky
- Posts: 162
- Joined: Sat Jul 31, 2004 4:00 pm
- Location: England
Hi
I like working at 96k with lots of native convolution reverb, and as you can understand this annoys my processor no end. I really like the Scope system, and I have a high-quality Pulsar II card in my computer.
Anyone got any ideas on what would be a good way to link two computers using System Link?
With VST System Link, you can link any two computers that have ASIO soundcards with digital I/O. I have a Pulsar II Plus with balanced plus AES digital I/O. I can't afford another Pulsar II to fit in my second machine if I go through with this project, so what other soundcard could I use? I believe that my soundcard has S/PDIF (correct me if I am wrong) but I can't find an output for it...
Could someone suggest the recommended setup for the computer (inc. what hardware to use) and how to link the soundcards...
S/PDIF wouldn't do you any good, nor would AES/EBU. System Link requires either an entire channel just for data, or, if necessary, it can reduce the data bandwidth of the audio sent down that channel to leave enough overhead for the 'System Link' control channels. This means that you would effectively have one high-quality mono channel and one lower-quality channel. ADAT is what you really want to use, and you'll only get 4 channels in each direction if you're running at 96 kHz - and that's only if both devices support double data rate, a.k.a. S/MUX.
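For the channel-count arithmetic, a minimal sketch (Python, assuming the usual 8-channels-per-port ADAT figure and S/MUX halving) of how channels trade off against sample rate on a single ADAT port:

```python
# Rough model of ADAT lightpipe capacity (assumption: 8 channels per port at
# 44.1/48 kHz, halved each time the sample rate doubles when S/MUX is used).
BASE_CHANNELS = 8
BASE_RATE = 48_000  # Hz

def adat_channels(sample_rate_hz: int) -> int:
    """Channels available on one ADAT port at the given rate, assuming S/MUX."""
    multiplier = max(1, round(sample_rate_hz / BASE_RATE))
    return BASE_CHANNELS // multiplier

for rate in (44_100, 48_000, 96_000, 192_000):
    print(f"{rate:>6} Hz -> {adat_channels(rate)} channels per ADAT port")
# 44100/48000 Hz -> 8 channels, 96000 Hz -> 4, 192000 Hz -> 2
```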
cleanbluesky:
My destinations get mixed down to 44.1k. I like recording at 96k as I believe that a person should record in the best that is available and keep the masters that way. I wish that I could record in 192k or even DSD (sigh), but SCOPE doesn't support 192k and I'm not even sure if there is any software to put effects on DSD. I believe mastering software for DSD is recent...
Could you give more information or solutions on linking a system with ADAT?
What if I bought a SCOPE home card with AES and added it to my first computer for extra digital outs?
On 2004-08-10 16:38, cleanbluesky wrote:
... I like recording at 96k as I believe that a person should record in the best that is available and keep the masters that way. ...

I'm pretty sure that your 'freewheeling' 96k recording sounds worse than a high-quality, studio-clocked 44.1k one.

Not that I want to convince you to change sample rates, but imho one should keep things simple.
An analogy from sports: no need to race all the time - just be ahead of the competition in the right moment

The ADAT connection of two systems depends mostly on your recording preferences.
A common example is a Mac running Logic under OS X fed by an RME card (due to the lack of Scope OS X support).
In this scenario Scope is handled exactly like a piece of outboard gear.
Logic communicates with the Scope devices via MIDI and the audio is received on 16 ADAT channels.
But this can also be set up with Scope cards, preferably with extra ADAT plates. Just think of one Scope system as a large external sound/FX module.
Again, it has been noted (by people more knowledgeable than me) that the 'definition' of the sound gains a lot from the use of an external master clock.
I'll have to dig out that diagram which showed in a really simple way how clock deviation distorts the sound...
This reads a bit like overkill (and I wouldn't take it too seriously), but you were the one who brought up 'the best' sound quality.

cheers, Tom
cleanbluesky:
A lot of people have differing opinions about the best/most practical sample rate. Most of it revolves around whether it is possible to 'hear' the difference between sample rate X and sample rate Y. I don't even care if it's audible; it's best to record with the most definition and work downwards from there. My processor disagrees, however, and moans whenever I attempt more than 3-5 good plugins. Now if I want, say, 16 plugins over 8 channels, I am screwed. I am not even sure that two computers would help.
I notice that SCOPE offers I/O extension, which would make things compatible (but expensive) as I would need another SCOPE card and 2 I/O cards.
On 2004-08-11 04:28, deejaysly wrote:
... If I say that I am recording samples via a Pulsar input and then everything I do after that doesn't go outside the PC, would I ever experience what you mention?...

Well, the clock quality is a crucial part of digital processing - and imho much more important than the sample rate.
I'm really no expert on this topic, as that stuff is bloody expensive - recently 'stumbled' over an Apogee 'BigBen'...

but the facts Apogee uses in their ads have been mentioned here several times, too.
Imho it's easy to understand with the help of a small graph (I didn't find the original, so I had to reproduce the basic idea):
The green marks symbolize the 'regular' sampling points in perfect timing.
The lilac lines describe what happens if the clock deviates slightly. The system assumes it's at the green timing point and hence miscalculates the amplitude, adding two bumps to the curve.
And THAT effect is much easier to notice than a different sample rate, and it even INCREASES with increasing sampling rate.
The higher the frequency, the less timing deviation is tolerable in the clock signal.
At least to a certain degree, until the 'artifact' frequencies get too high to be noticed.
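A minimal sketch of that effect (Python/NumPy; the sample rate, tone frequency and 1 ns RMS jitter figure are assumptions for illustration): sample a sine with a slightly jittery clock, store the values as if they were on the ideal grid, and measure the amplitude error.

```python
# Samples taken slightly early/late but stored as if they were on the ideal
# grid pick up an amplitude error, and the error grows with signal frequency.
import numpy as np

fs = 96_000            # assumed sample rate (Hz)
f = 10_000             # assumed test-tone frequency (Hz)
jitter_rms = 1e-9      # assumed 1 ns RMS clock jitter

rng = np.random.default_rng(0)
n = np.arange(1 << 14)
ideal_t = n / fs
jittered_t = ideal_t + rng.normal(0.0, jitter_rms, size=n.size)

ideal = np.sin(2 * np.pi * f * ideal_t)       # what a perfect clock would capture
actual = np.sin(2 * np.pi * f * jittered_t)   # what the jittery clock captures

err = actual - ideal
print(f"RMS amplitude error: {np.sqrt(np.mean(err ** 2)):.2e}")
# roughly 2*pi*f*jitter_rms/sqrt(2): double the frequency (or the jitter)
# and the error doubles too
```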
But I dunno if a studio-clock-controlled Scope card is easily distinguished from a non-externally-clocked one - in fact I don't even know the quality of the clock generator at all.
But I assume those high priced studio goodies must have some use

At least compared to optical (adat) sync it makes sense.
cheers, Tom
Very good diagram! If you use an external device's clock via wordclock, you can only improve sound quality. Some may not hear it, but it's there.
As for sample rate, I agree with the idea of sticking with 44.1. However, I record at 24-bit 44.1. When recording at 24-bit, you reduce (but not eliminate) another one of digital audio's undesirable side effects: quantization error.
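For reference, a quick sketch (Python) of the textbook rule of thumb relating bit depth to the theoretical quantization noise floor - a formula, not a measurement of any real converter:

```python
# Theoretical SNR of an ideal N-bit uniform quantizer with a full-scale sine:
# about 6.02*N + 1.76 dB, so each extra bit buys roughly 6 dB.
def quantization_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (16, 20, 24):
    print(f"{bits}-bit: ~{quantization_snr_db(bits):.1f} dB SNR")
# 16-bit: ~98.1 dB, 20-bit: ~122.2 dB, 24-bit: ~146.2 dB
```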
cleanbluesky:
Thanks, you helped me understand jitter. Wordclock is important, although eliminating jitter, by my understanding, would retain quality rather than improve quality?
If you are willing to work in 24-bit, would you not also see the advantage of working in 96k? 44.1k is nice, but not as nice as 96k. I don't claim to hear the difference, but I know that the difference is there.
Andrew Valentine
Your theory sounds good, and would be good if resampling wasn't such a violent (to your music) act. Much of the gain from working at a higher rate will be lost in the resampling process; in fact, the end result can even be worse... More likely, either way sounds fine, so why waste precious processing power, bandwidth and storage for little to no gain? Just to brag to friends?
just my opinion based on facts, not a personal attack....
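A minimal sketch of the round trip being discussed (Python, assuming SciPy is available): decimate a 96 kHz test tone to 44.1 kHz with a band-limited polyphase resampler, convert back up purely for comparison, and measure the residual.

```python
# 44100/96000 = 147/320 exactly, so a polyphase resampler can do the job
# without any intermediate rate.
import numpy as np
from scipy.signal import resample_poly

fs_hi = 96_000
t = np.arange(fs_hi) / fs_hi                  # one second of audio
x = 0.5 * np.sin(2 * np.pi * 1_000 * t)       # 1 kHz test tone at -6 dBFS

down = resample_poly(x, 147, 320)             # 96k -> 44.1k (band-limited)
back = resample_poly(down, 320, 147)          # 44.1k -> 96k, comparison only

n = min(len(x), len(back))
err = back[:n] - x[:n]
print(f"round-trip RMS error: {np.sqrt(np.mean(err ** 2)):.2e}")
# content above ~22 kHz is gone for good, of course - this only measures what
# the filtering does to material inside the 44.1k band
```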
hi cleanbluesky,
have you ever tried
http://www.fx-max.com/fxt/ ?
it works great with SIR, for example.
best,
andre
On 2004-08-12 05:36, cleanbluesky wrote:
...although eliminating jitter, by my understanding, would retain quality rather than improve quality?...

That is correct as far as playback of a 'perfect' recording is concerned.
But there's (frequently) also an AD conversion of several input channels.
In that context a more stable clock will improve the input signal, too.
Btw, the original designer of that diagram (much nicer than my version - it also contained the distorted waveform) was very critical of higher clock rates, because the stability of the master clock needs a tremendous amount of technology to achieve the desired quality.
I agree with GaryB that the samplerate conversion is a delicate process and it's very likely to eat up the improvement of the first stage.
On Scope, the current design of the boards is highly timing-critical due to the free distribution of DSP resources.
Imho that's the reason why many 96k processes put such a high load on the DSPs and just don't run smoothly: to stay phase accurate there's not enough time for the signal path on the circuit boards.
Yamaha's mLAN design is very interesting in this context. They (have to!) use custom routing chips with stunning data rates - you can't do this in software anymore.
Nevertheless the audio performance of various op amps and converters is a fascinating theme.
People still spend small fortunes on those famous (selected) 16-bit converters found in high-end CD players - and I have to admit that I like my original Pulsar's 20-bit types very much, even the 18-bit types of the old A16.

But I totally agree that one should try to reach the best quality possible - I'd really like to try out such a quality clock, but it's a bit beyond my budget. Too bad there's no such shop in my area...
cheers, Tom
cleanbluesky:
AndreD, you may be onto something there! From your post I assume you use FX-Teleport? So I can link two computers over LAN and use it like that? What sort of performance are you getting (if you use it)?
I was thinking about buying a few tiny PC cases and sticking them under my desk (with LAN connections, processors, 256-512 MB RAM and less than a 20 GB HD - I may have to use a laptop HD with such a setup; do you think the slower hard drive access rate would affect the system a lot? Should I use external 3.5"?)
As far as the 96k vs. 44.1k debate is concerned, I don't think that 96 thousand cycles per second is a lot for a computer to synchronise - I am not fully versed in the technology behind this, but my argument is based on what I read about digital recordings of ultrasonic bat frequencies. They use custom setups that include data acquisition (DAQ) cards which run somewhere near DSD frequency to take millions of readings a second.
"more likely, either way sounds fine, so why waste precious processing power, bandwidth and storage for little to no gain? just to brag to friends?
just my opinion based on facts, not a personal attack...."
Nice disclaimer at the end there, GaryB, but I can understand why you might be irate in a discussion such as this; some sample rate threads descend into arguing over unimportant details (although everyone who has replied in this thread has been super helpful to me).
I like to record at 96k because if I record something I like, I would not like to think that it could have been better just because of sample rate issues. Or any other issues for that matter. 44.1k is great. So is 96k. I would also like a shot at 384kHz/64-bit (he he), which I am sure would sound great.
I agree that there may be issues when downsampling back to 44.1k. Or maybe the averaging process improves the quality... That may be something for me to think about when making future recordings. But I still find my computer lagging in the processor department, and I don't think that even a processor upgrade alone would be enough juice... not if I wanted heavy processing on multiple tracks. My Drumkit From Hell Superior adds 8-10 tracks alone, so processing on all of those at 96k is unrealistic. Not without more processor help.
On 2004-08-12 16:56, cleanbluesky wrote:
...
As far as the 96k vs. 44.1k debate is concerned, I don't think that 96 thousand cycles per second is a lot for a computer to synchronise - I am not fully versed in the technology behind this, but my argument is based on what I read about digital recordings of ultrasonic bat frequencies. They use custom setups that include data acquisition (DAQ) cards which run somewhere near DSD frequency to take millions of readings a second.
...

Your mighty 3 GHz CPU hasn't a chance in hell to catch up with what's going on in the circuitry of a Scope board - clocked at 60 MHz.

... and I assume they didn't record a swarm of bats on a single track each

Seriously, a general-purpose CPU can shift some registers pretty fast, but even the BEST memory interface technology available isn't capable of keeping up with the load inside Scope.
The SFP software is nothing but a remote controller, a kind of map to symbolize the data flow. The execution itself happens ONLY on the board.
The internal architecture of the DSPs is much more sophisticated in dataflow - don't be irritated by the big numbers of the CPU manufacturers.
There must be a reason that almost every quality 'soundcard' contains a DSP for routing these days.

I could give you a drastic off-topic example of an experience from yesterday.
I had to install an Oracle database in a 'vintage' environment (due to some trouble) and the easiest way was a Windows NT emulation (!) on a G4 Mac at 400 MHz.
Later I repeated the exact same process on a P4 at 2.6 GHz. Let's say it was 3-4 times faster, but definitely not more.
I was really shocked how long it took - expecting 10 minutes but ending up at half an hour or so.
It's really hard to believe the sh*t that's programmed today - and if CWA's installer hasn't the best reputation... the Oracle crap beats it hands down.
Nice to be able to say this after 2 wasted days - and I only succeeded because I ignored all kinds of errors and kept processing, finally saving my 40k records from a file that couldn't even be stored and reloaded without crashing the machine.

cheers, Tom
cleanbluesky:
On 2004-08-12 21:34, astroman wrote:
Your mighty 3 GHz CPU hasn't a chance in hell to catch up with what's going on in the circuitry of a Scope board - clocked at 60 MHz.

This sounds interesting... is it because Scope does a specific kind of processing versus the general-purpose kind a P4 does?

On 2004-08-12 21:34, astroman wrote:
...The internal architecture of the DSPs is much more sophisticated in dataflow - don't be irritated by the big numbers of the CPU manufacturers...

My point was that much vaster amounts of data can be shifted than 96k audio. Why is it so hard to shift 96k audio? I don't argue that the SHARCs are advanced and that the zero-latency mixing comes in handy; I was just wondering why you said that a few channels of 96k is a lot to shift...

On 2004-08-12 21:34, astroman wrote:
...There must be a reason that almost every quality 'soundcard' contains a DSP for routing these days...

So does my Audigy 2.

It's an overstressed example by me (there is not much detailed information of this kind available) and refers to a company's comparison of a DSP and a regular CPU in a certain performance test.
They picked the TigerSHARC ADSP-TS101S and the PowerPC MPC7410 because they had a similar estimated floating-point performance (26,000 FFTs per second) - don't be irritated by the specific chips, the test was designed to be generalized easily.
The DSP yielded a real-world result of 22,000 at a 250 MHz clock rate, the G4 8,000 at 500 MHz.
The test programs were hand-coded in machine language with all optimizations for each chip - something you'll NEVER find in general-purpose programming, so a regular CPU in a regular app will perform at least 5 times worse.
Details are here
their bottom line:
The benchmark implementations and testing supported the conclusions from Part I that the TigerSHARC is a superior real-time signal processor. In fact, the results make them even more emphatic, such that they bear repeating:

“If the application requires a lot of number crunching with little data movement, typical of so-called back-end data processing, then the PowerPC’s higher clock rate and more powerful core will probably be more effective. For continuous real-time signal processing such as imaging, radar, sonar, sigint, and other applications that require high data flow or throughput, however, the TigerSHARC can dramatically outperform the PowerPC and is probably the preferred choice.”

Clearly these processors are designed and optimized for different applications. In the case of the TigerSHARC, virtually all data movement is done in background and the performance is driven by algorithmic speed; in the case of the PowerPC, literally all of the processing is done in background for this application, and the performance was driven by I/O and cache overhead.
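For a rough feel of the 'FFTs per second' figure quoted above, a tiny sketch (Python/NumPy - a stock library on a general-purpose CPU, nothing hand-optimized) that times 1024-point complex FFTs:

```python
# Time 1024-point complex FFTs with a stock library. The number mainly shows
# how much the toolchain and data movement matter, not what the silicon could
# do at its hand-coded limit.
import time
import numpy as np

N = 1024
x = (np.random.randn(N) + 1j * np.random.randn(N)).astype(np.complex64)

reps = 20_000
start = time.perf_counter()
for _ in range(reps):
    np.fft.fft(x)
elapsed = time.perf_counter() - start

print(f"~{reps / elapsed:,.0f} complex {N}-point FFTs per second")
```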
cheers, Tom
ps: what's that big chip in the middle of your Audigy?
Of course Creative can afford a custom design while others have to rely on general-purpose stuff...