#Define vs Const

New to FreeBASIC? Post your questions here.
Post Reply
StillLearning
Posts: 54
Joined: Aug 27, 2019 22:22

#Define vs Const

Post by StillLearning »

I want an easy way to isolate bits of a byte.
I tried using #define to name all the bits and then using a #define for a mask. This works.
The other way is to use a Const for each bit of the byte. It also works.
Does a #Define use more or less memory than a Const?
Is a #Define faster than a Const?
Which way is better?

Example: using #Define

Code: Select all

'using #define
#Define Bit1  1
#Define Bit2  2
#Define Bit3  4 
#Define Bit4  8
#Define Bit5  16
#Define Bit6  32
#Define Bit7  64
#Define Bit8 128
'other bits currently not used, but more can easily be added in the future

'create a mask. Others can easily be made.
#Define ComboBits  Bit1+Bit2+Bit3

Dim ColorByte  as UInteger

ColorByte = (ColorByte and ComboBits) 'isolate the first 3 bits.

end
Example using Const

Code: Select all

'using Const
Const Bit1 = 1
Const Bit2 = 2
Const Bit3 = 4
Const Bit4 = 8
Const Bit5 = 16
Const Bit6 = 32
Const Bit7 = 64
Const Bit8 = 128
'other bits currently not used, but more can easily be added in the future

'create a mask. Others can easily be made.
Const ComboBits = Bit1+Bit2+Bit3

Dim ColorByte  as UInteger

ColorByte = (ColorByte and ComboBits) 'isolate the first 3 bits.

end
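For what it's worth, both approaches can be exercised side by side in one small program. This is a sketch for comparison only; the D/C name prefixes are invented here to keep the two sets apart, and the Const lines use FreeBASIC's `Const name = value` syntax:

```freebasic
' sketch: same mask built via #define and via Const
#Define DBit1 1
#Define DBit2 2
#Define DBit3 4
#Define DComboBits (DBit1+DBit2+DBit3) ' parentheses keep the expansion safe

Const CBit1 = 1, CBit2 = 2, CBit3 = 4
Const CComboBits = CBit1+CBit2+CBit3

Dim ColorByte As UInteger = &B10110101
Print ColorByte And DComboBits ' 5
Print ColorByte And CComboBits ' 5
Sleep
```

Both lines print the same value, since the masks are identical once evaluated.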
Is there a better way that easily allows changes without affecting the existing bits in the future?
bcohio2001
Posts: 556
Joined: Mar 10, 2007 15:44
Location: Ohio, USA
Contact:

Re: #Define vs Const

Post by bcohio2001 »

"#Define" does a text replacement in the code before compilation, while "Const" creates a typed compile-time constant.

Beyond that, it's a matter of which coding style you prefer.
D.J.Peters
Posts: 8586
Joined: May 28, 2005 3:28
Contact:

Re: #Define vs Const

Post by D.J.Peters »

With Const you can also specify the right datatype!

' an 8-bit byte mask
const as ubyte mask8 = &HFF
' a 16-bit word mask
const as ushort mask16 = &HFFFF
...
' 8 bit masks
const as ubyte bit0 = 1 shl 0
const as ubyte bit1 = 1 shl 1
const as ubyte bit2 = 1 shl 2
const as ubyte bit3 = 1 shl 3
...

Joshy
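Joshy's `shl` idea can be fleshed out into a self-contained sketch (the names here are illustrative, not from the original post):

```freebasic
' bit constants built with shl: adding or renumbering a bit is a one-line edit
const as ubyte bit0 = 1 shl 0 ' &H01
const as ubyte bit1 = 1 shl 1 ' &H02
const as ubyte bit2 = 1 shl 2 ' &H04
const as ubyte bit3 = 1 shl 3 ' &H08

' combine with Or rather than + : clearer intent, and harmless
' even if the same bit were listed twice
const as ubyte lowNibble = bit0 or bit1 or bit2 or bit3 ' &H0F

dim as ubyte colorByte = &HA7
print hex(colorByte and lowNibble) ' 7
sleep
```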
counting_pine
Site Admin
Posts: 6323
Joined: Jul 05, 2005 17:32
Location: Manchester, Lancs

Re: #Define vs Const

Post by counting_pine »

#define uses simple text replacement, which can have weird effects because the replaced text doesn't respect operator precedence.

Code: Select all

const as ubyte bit1 = 1, bit2 = 2, bit3 = 4

const ComboBits       = bit1+bit2+bit3 ' = 7
#define ComboBitsDefine bit1+bit2+bit3

print ComboBits*3       ' = 21
print ComboBitsDefine*3 ' = 15 (expands to bit1+bit2+bit3*3)
I can't think of any good reason to use #define for constants, but I can think of several bad ones. So I'd recommend Const where possible.
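If a #define must be used for an expression, wrapping the body in parentheses avoids the precedence trap shown above. A quick sketch:

```freebasic
const as ubyte bit1 = 1, bit2 = 2, bit3 = 4

' parenthesized expansion: the whole sum is evaluated before any outer operator
#define ComboBitsDefine (bit1+bit2+bit3)

print ComboBitsDefine*3 ' 21, same as the Const version
sleep
```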
dodicat
Posts: 7976
Joined: Jan 10, 2006 20:30
Location: Scotland

Re: #Define vs Const

Post by dodicat »

There is something wrong with 64 bit gcc.

Code: Select all

const four=sin(.5)^2+cos(.5)^2  + sin(.5)^2+cos(.5)^2 +sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2
#define _four sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2
dim shared as double four_=sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2



sub fbmain
    dim as long x
    for n as long = 1 to 100000000
        rnd 'warm up
    next

    for k as long = 1 to 5
        x = 0
        dim as double t = timer
        for n as long = 1 to 100000000
            x += four
        next n
        print timer - t, x, "const"

        x = 0
        t = timer
        for n as long = 1 to 100000000
            x += _four
        next n
        print timer - t, x, "define"

        x = 0
        t = timer
        for n as long = 1 to 100000000
            x += four_
        next n
        print timer - t, x, "variable"

        x = 0
        t = timer
        for n as long = 1 to 100000000
            x += sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2
        next n
        print timer - t, x, "raw"
        print
    next k
    print "Done"
end sub

fbmain

sleep
Whether the looping is inside a sub or in the main program, 64-bit gcc takes too long for the variable four_.
32-bit gas/gcc and gas64 are all OK.
marcov
Posts: 3455
Joined: Jun 16, 2005 9:45
Location: Netherlands
Contact:

Re: #Define vs Const

Post by marcov »

Note that another possible difference is the point where the type is fixed:

For Const this can be done by the parser at declaration (for the whole Const line); for untyped constants, when just the literal is parsed. Either way, one declaration usually serves all uses.

#defines, however, depend on the context of usage, which can be both a blessing and a curse.
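A related, easy-to-see consequence of text replacement is that a #define body is re-evaluated at every use site, while a Const is evaluated once at declaration. A sketch:

```freebasic
randomize
#define r rnd  ' expands to a fresh rnd call wherever it appears
const c = 0.5  ' value (and type) fixed once at declaration

print r, r ' almost certainly two different numbers
print c, c ' always the same value twice
sleep
```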
hhr
Posts: 206
Joined: Nov 29, 2019 10:41

Re: #Define vs Const

Post by hhr »

@Dodicat
Can it be the conversion Long-Double?
dodicat
Posts: 7976
Joined: Jan 10, 2006 20:30
Location: Scotland

Re: #Define vs Const

Post by dodicat »

Hi hhr.
Yes, it is the casting to Long; it is too slow with gcc.
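If the per-iteration Double-to-Long conversion is the bottleneck, one possible workaround (a sketch, not benchmarked against the 64-bit gcc backend discussed above) is to accumulate into a Double and convert once after the loop:

```freebasic
' stand-in for the computed constant from the benchmark above
dim shared as double four_ = 4

dim as double acc = 0
for n as long = 1 to 100000000
    acc += four_ ' no narrowing conversion inside the loop
next
dim as long x = clng(acc) ' convert once, after the loop
print x
sleep
```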
deltarho[1859]
Posts: 4292
Joined: Jan 02, 2017 0:34
Location: UK
Contact:

Re: #Define vs Const

Post by deltarho[1859] »

It is worth looking at: Symbolic Constants

I tried posting this a few days ago but the link wasn't working.
Post Reply