Audio library for FreeBasic - Features

General discussion for topics related to the FreeBASIC project or its community.
angros47
Posts: 2321
Joined: Jun 21, 2005 19:04

Re: Audio library for FreeBasic - Features

Post by angros47 »

I used gedit on Linux. I assume you are using Windows. Try opening the files in WordPad and saving them again, to see if that solves the issue.
grindstone
Posts: 862
Joined: May 05, 2015 5:35
Location: Germany

Re: Audio library for FreeBasic - Features

Post by grindstone »

You're right, I'm using Windows. I know the WordPad trick, but it doesn't work if there aren't any line breaks at all (no Chr(10), not even a space).

Maybe I could contribute some routines for amplifying, normalizing and converting .WAVs (PCM and float, 16 and 32 bit), but I'd have to recode them to use your naming conventions and to access buffers instead of files.

BTW, I agree with the idea of making the sound library behave similarly to the graphics library.
angros47
Posts: 2321
Joined: Jun 21, 2005 19:04

Re: Audio library for FreeBasic - Features

Post by angros47 »

I hope this helps:
https://superuser.com/questions/757074/ ... text-files

Your routines could surely help. Internally, in my library, WAV data is stored in 8- or 16-bit format, mono or stereo, uncompressed PCM, with a fixed-size header. So, in theory, working on those buffers should be easier than working on files.
Juergen Kuehlwein
Posts: 284
Joined: Mar 07, 2018 13:59
Location: Germany

Re: Audio library for FreeBasic - Features

Post by Juergen Kuehlwein »

@grindstone,

you need an editor which can handle LF-only (0x0A) line breaks. On Windows the standard is CR+LF (0x0D + 0x0A). Obviously your editor doesn't accept a lone LF as a line break (or maybe it just isn't set to do so). Or write a little conversion utility replacing LF with CR+LF in angros47's sources, along the lines of the sketch below.
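For illustration, a minimal sketch of such a conversion utility (the file names are placeholders; it simply inserts a CR before every LF that doesn't already have one):

Code: Select all

'minimal sketch: convert LF-only line endings to CR+LF (file names are placeholders)
Dim As String src = "input.bas", dst = "output.bas"
Dim As String buf, outbuf

Open src For Binary Access Read As #1
buf = Space(Lof(1))
Get #1, , buf                      'read the whole file into a string
Close #1

For i As Integer = 1 To Len(buf)
	If Mid(buf, i, 1) = Chr(10) AndAlso (i = 1 OrElse Mid(buf, i - 1, 1) <> Chr(13)) Then
		outbuf += Chr(13, 10)          'bare LF: write CR+LF instead
	Else
		outbuf += Mid(buf, i, 1)       'copy everything else unchanged
	End If
Next

If Dir(dst) <> "" Then Kill dst    'start with a fresh output file
Open dst For Binary Access Write As #2
Put #2, , outbuf
Close #2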


JK
grindstone
Posts: 862
Joined: May 05, 2015 5:35
Location: Germany

Re: Audio library for FreeBasic - Features

Post by grindstone »

I don't know what happened, but today the WordPad trick works. Very strange...
angros47 wrote:Internally, in my library, WAV data is stored in 8- or 16-bit format, mono or stereo, uncompressed PCM, with a fixed-size header.
As the internal format I would strongly recommend 32-bit float, simply because this format loses the least quality during processing. One simple example: assume you attenuate an 8-bit PCM (music) file from a volume level of 0 dB down to -36 dB (a factor of 1/64); only the 2 most significant bits of each sample survive this process. If you then amplify that piece of sound back to 0 dB, you couldn't even recognize which song it was. If instead you convert the file to 32-bit float, attenuate/amplify it and convert it back to 8-bit PCM, there is no (perceptible) loss of quality.
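To see that bit loss in numbers, a rough sketch (the sample value is arbitrary, and the 128 offset of unsigned 8-bit PCM is ignored for simplicity):

Code: Select all

'-36 dB is roughly a factor of 1/64, so the 256 possible 8-bit values
'collapse to just 4 (2 bits); the lost detail cannot be recovered afterwards
Dim As Integer sample = 100               'arbitrary example value
Dim As Integer attenuated = sample \ 64   '= 1
Dim As Integer restored = attenuated * 64 '= 64, far from the original 100
Print sample, attenuated, restored
Sleep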
angros47 wrote:So, in theory, working on those buffers should be easier than working on files.
Of course I use a buffer to compute the new volume of my files, but I (mis)use a string variable for this purpose. The advantage is that I don't have to care about memory management; FB does all of that for me.

For now, two functions that don't need to be recoded:

Code: Select all

Function db (ra As Single) As Single 'converts ratio to dB
	
	Return 20*(Log(ra)/Log(10))
	
End Function

Function ra (db As Single) As Single 'converts dB to ratio
	
	Return Exp((db / 20) * Log(10))
	
End Function
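A quick check of those two conversions, tying in with the 1/64 example above (assuming the two functions are in the same file):

Code: Select all

Print db(1/64)    'about -36.12 (dB)
Print ra(-36.12)  'about 0.0156, i.e. roughly 1/64
Sleep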
angros47
Posts: 2321
Joined: Jun 21, 2005 19:04

Re: Audio library for FreeBasic - Features

Post by angros47 »

I perfectly understand the superiority of floating point formats over fixed point formats. The main drawback of digital information, compared to analogue recording, is that intermediate values are always rounded to the closest allowed value (an issue affecting all digital information: sound, images, voltage measurements and more). Using floating point values largely removes that issue, so it might be the ideal way to store digital information (quantization errors and artefacts can be reduced or eliminated; only sampling artefacts remain).

Still, there are two issues. The first is that not every platform supports that format: DOS surely doesn't (neither the Sound Blaster nor the emulated one provided by the Windows Sound System does), and some versions of OSS on Linux don't either. 8-bit PCM, on the other hand, is supported everywhere, and 16-bit PCM almost everywhere (since it's the format used by audio CDs).

The second issue is that a large number (if not most) of WAV files are in 8- or 16-bit format, and although storing them in floating point format would cause no perceptible loss of quality, if you read them and save them again there will still be some rounding artefacts. This won't happen if you keep them in their original format and instead convert the data you want to add into that same format.

The routines that create new waveforms internally use the floating point format; so do the filters and the modulators. If you want to apply them to a wave file, the wave data is converted to floating point format to allow more precise editing. But that is the only case where floating point format is used for the wave data.
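To make that last point concrete, here is a minimal sketch of mixing a float-generated sample into a 16-bit buffer by converting it to the buffer's format (the name and scaling are my own illustration, not the library's actual internals):

Code: Select all

'illustration only: add a -1.0..+1.0 float sample into a 16-bit PCM buffer
Sub MixFloatInto16(buffer As Short Ptr, index As Integer, value As Single)
	Dim As Integer mixed = buffer[index] + CInt(value * 32767)
	'clamp, so an overflow saturates instead of wrapping around
	If mixed > 32767 Then mixed = 32767
	If mixed < -32768 Then mixed = -32768
	buffer[index] = mixed
End Sub

Dim As Short samples(0 To 3)          'a tiny 16-bit buffer
MixFloatInto16(@samples(0), 2, 0.5)
Print samples(2)                      'prints 16384 (half of full scale)
Sleep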
grindstone
Posts: 862
Joined: May 05, 2015 5:35
Location: Germany

Re: Audio library for FreeBasic - Features

Post by grindstone »

Your thoughts make sense, I agree.

Here are some routines for amplification and normalizing, with a little example; at the moment 32-bit float only:

Code: Select all

Declare Function db (x As Single) As Single
Declare Function ra (x As Single) As Single
Declare Sub ampfloat (buffer As Single Ptr, bufsize As ULong, amplification As Single)
Declare Function getlevel(buffer As Single Ptr, bufsize As ULong) As Single
Declare Sub setLevel(buffer As Single Ptr, bufsize As ULong, level As single)

Dim As String source = "C:\My32bitFloatInputFile.wav" 'source file
Dim As String dest = "C:\My32bitFloatOutputFile.wav" 'destination file
Dim As String g
Dim As Integer header, bytesread

Open source For Binary Access Read As #1
Open dest For Binary As #2

Do 'get header size
  header += 1
  Seek #1, header
  g = Input(4, #1)
Loop Until g = "data"
header += 7

'transfer header
Seek 1,1
Print #2, Input(header, #1);

Dim As Single Ptr pb = Callocate(Lof(1)) 'create buffer
Get #1,,*pb, (Lof(1) - header) / 4, bytesread 'write source file data to buffer
Print "Old level "; getlevel(pb, bytesread); "dB"
setLevel(pb, bytesread, -20) 'set volume level to -20 dB (true RMS)
Put #2,,*pb, bytesread/SizeOf(Single) 'write audio data to destination file
Print "New level "; getlevel(pb, bytesread); "dB"

DeAllocate pb
Close

Sleep

Sub ampfloat (buffer As Single Ptr, bufsize As ULong, amplification As Single)
  'amplifies the audio data by the stated factor
        
	For gp As Single Ptr = buffer To buffer + (bufsize \ SizeOf(Single)) - 1 'bufsize is in bytes; stop at the last sample
		*gp *= amplification
	Next 
  
End Sub

Sub setLevel(buffer As Single Ptr, bufsize As ULong, level As Single)
	'sets the volume level of the audio data to the stated value (in dB, true RMS) 
	Dim As Single amp
	
	amp = ra(level - getlevel(buffer, bufsize))
	ampfloat(buffer, bufsize, amp)
	
End Sub

Function getlevel(buffer As Single Ptr, bufsize As ULong) As Single
	'calculates the volume level of the audio data in dB (true RMS)
	Dim As Single totval

	For gp As Single Ptr = buffer To buffer + (bufsize \ SizeOf(Single)) - 1 'bufsize is in bytes; stop at the last sample
		totval += Abs(*gp) * Abs(*gp)
	Next
	Return db(Sqr(totval / (bufsize / SizeOf(Single))))

End Function

Function db (x As Single) As Single 'converts ratio to dB
	
	Return 20*(Log(x)/Log(10))
	
End Function

Function ra (x As Single) As Single 'converts dB to ratio
	
	Return Exp((x / 20) * Log(10))
	
End Function

coderJeff
Site Admin
Posts: 4313
Joined: Nov 04, 2005 14:23
Location: Ontario, Canada

Re: Audio library for FreeBasic - Features

Post by coderJeff »

angros47 wrote:Designing an API is likely harder than implementing it, since any future change will break a lot of code. ... I think that the sound library should behave like the graphic library, if possible: either they both should go under the main namespace, or they both should have dedicated namespaces.
Yes, the API should be given some careful thought, since once it is added, it's probably going to be there for a long time.

OK, let's put the general naming/#include/namespace issue to the side for the moment; while it is important, it is only a small part of designing the API. Instead, let's examine the development path of the fbgfx library/API. (I will follow up in a second post on the naming issues.)

1) What does the fbgfx library actually do? How does it work to provide graphics?
a) The graphics library allows one display context/object per application.
b) When initialized, it creates a graphics object/context and connections to system resources (a window, timers, events, threads), connects a suitable display/keyboard/mouse driver, initializes colour formats, palette, blitters and image format, and initializes data (properties like current colour, scaling, and clipping).
c) When a graphics function (method) is invoked, the request is dispatched to the gfx context/object, either to update the gfx context state or to produce an immediate result.

2) The fbgfx library has two APIs: an internal API for the library itself, and an external API that the user can access. The two APIs are not the same, not even slightly. Even though the user's API exposes many parts of the internal graphics object/context, it is often wrapped with extra code for safety: it checks for invalid values, returns error codes, etc.

3) The internal API supports the implementation of the library itself. When a graphics method is called (e.g. PSET, LINE, PUT, etc.), the method implementation calls internal routines to set the rendering context, colours and clipping, select a blitter for image buffers, and so on. Internally, the library has a number of functions it must call, in a specific order. The internal API is for the library itself only; it doesn't exist outside the library.

4) The external public API gives the user access to the graphics context/object.
a) construct and destruct graphics mode (SCREEN, SCREENRES)
b) methods that invoke rendering results (PSET, LINE, PUT, DRAW, etc)
c) methods that change/read properties (COLOR, VIEW, WINDOW, SCREENCONTROL, etc)
d) the public API should be well documented and robust (a minimal usage example follows below)
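For concreteness, a minimal example of that public API in use (only existing fbgfx statements, nothing new):

Code: Select all

ScreenRes 320, 240, 32                 'a) construct a graphics context
Color RGB(255, 255, 0)                 'c) change a property (foreground colour)
Line (20, 20)-(300, 220), , B          'b) invoke a rendering result (a box)
Circle (160, 120), 60, RGB(0, 255, 0)  'b) another rendering result
Sleep                                  'wait for a key; the context is destroyed at exit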

5) The design of the public fbgfx API evolved over several years.
a) Initially, there was very little design, as we were just copying what was already in QB (leaving us with the "<gfx_statement> unique syntax" that can only be expressed in the compiler's parser; I think I will expand on this in another related thread on the forum).
b) Initially, the existing QB API influenced the design of the library implementation
c) Later, new features added to the library influenced the design of the API
d) The API was expanded over time as new features were added, or as various parts of the library's internals were exposed (SCREENEVENT)
e) Some parts of the API are formalized only by including fbgfx.bi

The reason I try to lay all this out is the several comments saying that the sound API should be just like the graphics API. For me, that leaves too much open to interpretation.
coderJeff
Site Admin
Posts: 4313
Joined: Nov 04, 2005 14:23
Location: Ontario, Canada

Re: Audio library for FreeBasic - Features

Post by coderJeff »

angros47 wrote:For that reason, I think that the sound library should behave like the graphic library, if possible: either they both should go under the main namespace, or they both should have dedicated namespaces.
So, coming back to the naming issue: the immediate advantage of #include + namespace is that it has the least impact on existing user code. The long-term advantage is that this approach would be readily compatible with the proposals in the FreeBASIC Namespace Project. And there is still the possibility of including some or all sound "keywords" by default later, if justified.

It's a challenge, because not every user/developer/contributor will agree with each other. However, developers will often agree among themselves, usually after discussion and possibly a justifiable compromise.
angros47
Posts: 2321
Joined: Jun 21, 2005 19:04

Re: Audio library for FreeBasic - Features

Post by angros47 »

From my point of view, I agree that the #include solution would be the easiest one to implement (at the moment, after all, my library works that way). My doubts about that solution come from the fact that it is more C-like than Basic-like. Basic (not just FreeBASIC, but also Visual Basic, QuickBasic, BlitzBasic, RapidQ...) usually provides the general features directly in its core language. C, on the other hand, offers nothing in its core language, not even PRINT; everything has to be done with libraries. Both approaches are perfectly valid, just different. In my personal opinion, both Basic and C should stay consistent with themselves, which is why I am not sure whether applying a solution more suitable for C is a good idea.

In FreeBASIC, external libraries are used for GUIs, for example (there are examples for the Windows GUI, for GTK, for FLTK, for wx-c...): as a result, there is no default solution, and a newbie has no idea which GUI they should use. It's basically the same issue as on Linux... fragmentation. For graphics, instead, no one has the slightest doubt about what to use, since the internal library is the most obvious choice.

My worry is: with an include-based approach, would people be able to figure out how to use audio, or would they struggle to decide between fmod, OpenAL, fbsound, and so on?

If you think my worries are unjustified, I would be fine with the "#include" solution.
coderJeff
Site Admin
Posts: 4313
Joined: Nov 04, 2005 14:23
Location: Ontario, Canada

Re: Audio library for FreeBasic - Features

Post by coderJeff »

I think you raise some valid concerns.

I would prefer to see the initial addition as an #include only, as that is the most trouble-free option for everyone right now and would allow development to continue. Doing it this way also pretty much guarantees that the API can be expressed 100% naturally in the fbc language itself.

Also, I would like to at least retain the option of a namespace, as I think the major complaint from advanced users is that fbc's many keywords just get in the way or are irrelevant to what they want to express in their code; that applies not just to new keywords, but to existing keywords too.

I understand that, for beginners, having something built in helps. What is the minimal API required? Does everything need to be added by default?
badidea
Posts: 2586
Joined: May 24, 2007 22:10
Location: The Netherlands

Re: Audio library for FreeBasic - Features

Post by badidea »

I don't think that the #include solution will be a problem for most users, as long as the library is shipped with FreeBASIC and works out of the box.

With 'non-native' libraries, one has to figure out:
- where to download
- which version
- is it up to date
- where to put the library
- how to link it
Stuff that most people don't want to have to deal with when they are just trying to get something to work.
Juergen Kuehlwein
Posts: 284
Joined: Mar 07, 2018 13:59
Location: Germany

Re: Audio library for FreeBasic - Features

Post by Juergen Kuehlwein »

@angros47,

... just to be sure, did you receive my mail?


JK
angros47
Posts: 2321
Joined: Jun 21, 2005 19:04

Re: Audio library for FreeBasic - Features

Post by angros47 »

Whoops, I missed it, checking it now
ShawnLG
Posts: 142
Joined: Dec 25, 2008 20:21

Re: Audio library for FreeBasic - Features

Post by ShawnLG »

I think FreeBASIC having its own sound library would be a great addition to its included APIs. The API would be useless if it could not mix and play multiple sound samples at the same time. Also, having the ability to adjust 3D positioning in real time for game sound effects would be great. Doing this is not hard: I wrote a sound API in QB4.5 decades ago, using the Future library for extended memory support to store the sound samples. I used lookup tables for the non-linear mixing calculations. The mixing has to be non-linear because adding several sound samples together can cause saturation, so I used a compressor function to soften the clipping without reducing the volume of the mixed audio.

How it works, the easy part:
When the sound system is initialized, it plays from two buffers, usually at the highest bit depth and sample rate the hardware supports. This works much like the graphics library's page flipping: one buffer gets played while the other gets updated.
How are samples added together? First, add two sixteen-bit samples at a higher bit depth, such as 32 bits, so we do not lose sound information to clipping. Then we run the summed samples through a compressor, which allows the result to be converted back to 16 bits without clipping. The volume of the samples is also applied during the mixing process.
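A rough sketch of that mixing step (my own illustration, not ShawnLG's original QB code; a hard clamp stands in for the compressor here):

Code: Select all

'mix two 16-bit samples in a wider type, then bring the result back to 16 bits
Function MixSamples(a As Short, b As Short) As Short
	Dim As Integer sum = CInt(a) + CInt(b)   'widen to 32 bits so the sum cannot overflow
	'hard clamp; a compressor/soft-clip curve would replace these two lines
	If sum > 32767 Then sum = 32767
	If sum < -32768 Then sum = -32768
	Return sum
End Function

Print MixSamples(30000, 10000)   'prints 32767 (saturated instead of wrapping around)
Sleep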

The hard part:
Compressing the audio is in fact distorting it, and audiophiles do not like that. I would recommend a variable with which the user can adjust the amount of compression, from purely additive to logarithmic. I would still use lookup tables for the calculation; I mixed several samples together in real time in interpreted QB on much slower hardware. Floating point is not necessary, and conversion between int and float is expensive.
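One possible shape for such a lookup table (just a sketch of the idea; the tanh-style curve and the table layout are my own choices, not the original implementation):

Code: Select all

'a table that maps every possible sum of two 16-bit samples (-65536..65534)
'back to 16 bits through a soft-clip curve, so loud mixes saturate gently
Dim Shared As Short softClip(0 To 131070)

Function softCurve(x As Double) As Double
	'tanh-shaped curve, written with Exp (FreeBASIC has no built-in Tanh)
	Return (Exp(2 * x) - 1) / (Exp(2 * x) + 1)
End Function

Sub buildSoftClipTable()
	For i As Integer = 0 To 131070
		Dim As Double x = (i - 65536) / 32768.0   'about -2.0 .. +2.0
		softClip(i) = CShort(softCurve(x) / softCurve(2.0) * 32767)
	Next
End Sub

'usage: mixed = softClip(CInt(a) + CInt(b) + 65536)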

If sound hardware today has 24-bit or higher DACs, you should be able to mix and play audio without compression. I am not familiar with today's sound hardware; it looks like it has not changed much in the last twenty years. I remember my Sound Blaster Live! card having more features than the stuff today.