
I am using an STM32F072C8T6 microcontroller with the HAL library. I wrote a program to output an analog voltage on the DAC pin of the microcontroller, but it does not work. I ran the debugger and could see that none of the DAC registers changed when I stepped through the code. Does anyone know if I am missing something in the code?

I took over the project from someone else. He generated the project configuration with CubeMX. However, I don't have the project's .ioc file (the CubeMX file), so I have to add the DAC functions manually instead of using CubeMX. What I did was uncomment `#define HAL_DAC_MODULE_ENABLED` in stm32f0xx_hal_conf.h and add stm32f0xx_hal_dac.c and stm32f0xx_hal_dac_ex.c to the Drivers folder.

Here is the DAC code in main.c:

DAC_HandleTypeDef hdac;
int main(void){
  HAL_Init();
  SystemClock_Config();
  DAC_ChannelConfTypeDef sConfig = {0};
  hdac.Instance = DAC;
  if (HAL_DAC_Init(&hdac) != HAL_OK)
  {
    Error_Handler();
  }
  sConfig.DAC_Trigger = DAC_TRIGGER_NONE;
  sConfig.DAC_OutputBuffer = DAC_OUTPUTBUFFER_ENABLE;
  if (HAL_DAC_ConfigChannel(&hdac, &sConfig, DAC_CHANNEL_1) != HAL_OK)
  {
    Error_Handler();
  }
  HAL_DAC_Start(&hdac, DAC_CHANNEL_1);
  HAL_DAC_SetValue(&hdac, DAC_CHANNEL_1, DAC_ALIGN_12B_R, 2048);
  while(1){
  }
}

The DAC output should be 1/2 × 3.3 V = 1.65 V. However, the actual voltage is 0 V, and all the DAC registers remain 0x00. I have also tried creating a new project with CubeMX, and the DAC works perfectly with that new project, so the hardware is not the problem.

cuckoo
  • Are you sure the DAC functions are being correctly linked in? If your debugger supports stepping through code, are you sure execution is getting all the way through the loop? Can you step *into* the HAL DAC functions correctly or does the debugger get confused? What is the expected behavior of `Error_Handler()` when there is a failure and do you see that behavior? – skrrgwasme Sep 06 '19 at 19:38
  • 2
    I haven't worked with these MCs or CubeMX before, but MC build systems can be very picky about how they build and link code. I'd be suspicious of your manual work just to get it to compile. Also, consider opening embedded questions over at [Electrical Engineering Stack Exchange](https://electronics.stackexchange.com/) instead. This is on topic here, but they do a lot of lower-level programming over there. If you move this question, make sure you delete this one here on SO so it's not duplicated on multiple sites within the SE network. – skrrgwasme Sep 06 '19 at 19:41
  • @skrrgwasme: I can step into the HAL DAC functions correctly. I even tried to modify the DAC registers directly in main file but it doesn't change. – cuckoo Sep 06 '19 at 19:54
  • @cuckoo Which IDE are you working in? I think the arguments of `HAL_DAC_SetValue(&hdac, DAC_CHANNEL_1, DAC_ALIGN_12B_R, 2048);` may be wrong. Make a project with the CubeMX software and use the DAC module, then see how to correctly initialize and change the DAC value. – EsmaeelE Sep 06 '19 at 20:01
  • http://www.openstm32.org/forumthread2014 – EsmaeelE Sep 06 '19 at 20:17

1 Answer


Enable the clock for the DAC in the RCC. Exactly which bit of which register should be set is documented in the reset and clock control chapter of the reference manual for your microcontroller.

As long as the clock of a peripheral is not enabled, all of its registers read 0, and the peripheral is not usable.
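For reference, a minimal sketch of the MSP init callback that CubeMX would normally generate for this setup (macro and pin names assume the STM32F0 HAL; on this part, DAC channel 1 is output on PA4):

```c
/* Typically placed in stm32f0xx_hal_msp.c; HAL_DAC_Init() calls this
   callback automatically during initialization. */
void HAL_DAC_MspInit(DAC_HandleTypeDef* hdac)
{
  GPIO_InitTypeDef GPIO_InitStruct = {0};

  /* Enable the DAC peripheral clock in the RCC. Without this, all DAC
     register writes are ignored and all reads return 0, which matches
     the behavior observed in the debugger. */
  __HAL_RCC_DAC1_CLK_ENABLE();

  /* The output pin must also be clocked and put in analog mode. */
  __HAL_RCC_GPIOA_CLK_ENABLE();
  GPIO_InitStruct.Pin  = GPIO_PIN_4;
  GPIO_InitStruct.Mode = GPIO_MODE_ANALOG;
  GPIO_InitStruct.Pull = GPIO_NOPULL;
  HAL_GPIO_Init(GPIOA, &GPIO_InitStruct);
}
```

Since you added the HAL DAC sources by hand rather than through CubeMX, this generated MSP code is exactly the piece that would be missing from your project.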

  • Every ARM-based microcontroller I've used has included clock-switching logic which disables the clock signals to various modules until they are explicitly enabled, so as to minimize power consumption by modules that aren't used. In some micros, attempting to access registers for a module that is powered down may trigger a hard fault interrupt (whose default handler is a branch-to-self instruction), while in other controllers, writes will be ignored while reads yield zero. I've often wondered how hard it would be to have a controller which would cause accesses to insert a wait state while... – supercat Sep 08 '19 at 18:18
  • ...running the clock just long enough to perform the access. If e.g. the system clock is running at 16MHz, a peripheral will be accessed 4,000x/second, it won't need a clock except while it's being accessed, and each access powers on the clock for four cycles, running the peripherals clock for 16,000 cycles each second would take a tiny fraction of the power of leaving it on constantly. I've never seen a peripheral do that, however. – supercat Sep 08 '19 at 18:21
  • @supercat Peripherals have to be clocked in order to do their work, not only for accessing their registers. A GPIO port would sort of work, sampling the pins "just in time" when the data register is accessed, but it could no longer trigger an interrupt (every GPIO pin can be configured to detect edges and trigger an interrupt on STM32). It would make no sense at all on communication ports which shift data in and out. – followed Monica to Codidact Sep 09 '19 at 09:53
  • Most peripherals will, at any given time, either know that they have something they need to do, or that they will have something they need to do if inputs are in particular states. If each peripheral fed a synchronous and asynchronous signal to the clock distribution module indicating whether it had anything interesting to do, the clock module could pass the async signal through a double synchronizer that was unconditionally active (but which, being just two registers located in the clock module, wouldn't consume much power), and only feed the rest of the registers in the module... – supercat Sep 09 '19 at 14:50
  • ...if there was something interesting for them to do. The transmit side of a UART could thus ensure that a UART remained clocked until the last bit of data was transmitted, and the receive side could ensure that it would be clocked if either the input state differed from expectation or if it was in the process of validating a start bit or receiving a byte, but when none of those things was happening, only two UART-related registers in the system (the synchronizer in the clock module) would need to be clocked, which should be much cheaper than clocking the whole UART. – supercat Sep 09 '19 at 14:54
  • Having the UART clocked in advance of an input stimulus would make it possible for it to respond more quickly by eliminating the double-synchronizer startup delay, but for the many applications where the delay would pose no problem and where a UART would mostly sit idle, having the clocks only run when needed would be more useful. – supercat Sep 09 '19 at 14:57