Channel: Infineon Forums

[SPI DMA] Missing first byte

Dear Infineon,

I'm currently working on an XMC4700-F100F2048 configured in DAVE with an SPI_MASTER app and a DIGITAL_IO app to manage the SPI chip select.

The SPI_MASTER app is configured in DMA mode for both transmit and receive and runs at 1 Mbit/s, full duplex.

My firmware consists of a bootloader and an application, developed in two separate DAVE projects. SPI works correctly in the bootloader, but when the application starts, the first SPI transaction doesn't send its first byte.

To reproduce this behavior I developed a test project with the SPI configured as follows:

Attachment 3684

Attachment 3685

Here is the main.c code:

Code:

#include <DAVE.h>                // Declarations from DAVE code generation (includes SFR declarations)

static uint8_t buffer[3] = { 0x01, 0x02, 0x03 };

void SPI_Send( uint8_t *data, uint8_t length )
{
        DIGITAL_IO_SetOutputLow( &SPI_SS );
        SPI_MASTER_Transmit( &SPI_MASTER, data, length );
        while( SPI_MASTER_IsTxBusy( &SPI_MASTER ) );
        /* wait until the master slave-select phase has finished */
        while( SPI_MASTER_GetFlagStatus( &SPI_MASTER, (uint32_t)XMC_SPI_CH_STATUS_FLAG_MSLS ) != 0U );
        DIGITAL_IO_SetOutputHigh( &SPI_SS );
}

int main(void)
{
        DAVE_Init();

        SPI_Send( buffer, 3 );

        SPI_MASTER_Init( &SPI_MASTER );

        SPI_Send( buffer, 3 );

        while(1U);
}

What is missing?

Thanks in advance for your kindness,

Andrea

XMC1300 GNU GCC Linker HELP

I've been struggling with this for 2 days now. I'm trying to use the GNU GCC compiler with the XMCLib SDK. I am using the GCC startup_XMC1300.S along with the XMC1300x0032.ld linker script,

using the following gcc command:

Code:

arm-none-eabi-gcc -g -Wall -O0 --specs=nano.specs -mthumb -fmessage-length=0 -fsigned-char -ffunction-sections -fdata-sections -mcpu=cortex-m0 -o prog.out -T XMC1300x0032.ld -DUC_FAMILY=XMC1 -DXMC1302_T016x0032 -DPROD [-I INCLUDE FILES] [SOURCE FILES] startup_XMC1300.S
With that I keep receiving:

Code:

(.text+0x68): undefined reference to `__bss_start__'
(.text+0x6c): undefined reference to `__bss_end__'

I can use -nostartfiles to get rid of it but then I get
Code:

init.c:(.text.__libc_init_array+0x12): undefined reference to `_init'
Please help! I have been googling for hours!

Questions about the bandgap-reference-circuit specified in the datasheet

Hi,

I am reviewing the datasheet to use the TLE9867QXW.
I have a question about the bandgap-reference-circuit during the ADC review.

1. Can I measure the bandgap reference using a bandgap-reference circuit on an external measurement unit?
1.1 If I can measure the bandgap reference, how is it done?

2. If I have misunderstood the datasheet and question 1 does not apply, please confirm below whether my understanding is correct.
2.1 This module is itself a bandgap-reference circuit, which produces the reference voltage for the 8-bit ADC. It also creates the reference voltage for the NVM module.

The third point is a request:

Please provide a datasheet that details the points above.

Regards,
Rocky

I am looking to use the IRMCK099 in a new design, but I have the following question:

The datasheet answers your question: the IRMCK099 contains the flexible Tiny Motion Control Engine (TinyMCE) for sensorless control of permanent magnet synchronous motors (PMSM) over the full speed range. The drive signals for a PMSM and a BLDC motor are different:
the BLDC is driven trapezoidally, while the PMSM is driven sinusoidally. So the answer is: no, you can't. You can visit the "Motor and Drives" page, which presents different solutions for permanent magnet synchronous motors and brushless DC motors.

Hope this helps.
forix

Aurix tc267d

Hi,
I'm new to AURIX. I've tried implementing SPI communication with the AURIX as master.
I'm facing an issue where my tasks are not being called at the expected rate.
There are various tasks in my OSTask.c, like ostask_5ms(), ostask_10ms(), ostask_20ms(), etc., and they should be called at the stated period, i.e. ostask_5ms() every 5 milliseconds.
But I observe that all these tasks are being called only after 42 to 45 seconds.
Also, there is no task with a 42 s or 45 s period.
Can anyone help me find where the issue is?

Configuration of DMA for Transfer from RAM to EBU Interface on XMC4700

Good Morning Everyone,

currently I am trying to configure a DMA channel to transfer 240x320 uint16_t values from a frame buffer in RAM to the EBU interface. But with the current configuration and implementation of GPDMA0_0_IRQHandler, the interface does not transfer every 16-bit value to my LCD.

However, if I use the conventional method with for loops, my image content is transferred to the display. In other words, the EBU interface to the display is configured correctly.

Does anyone have an idea what I need to change in the configuration and/or the interrupt service routine, so that I can move the data with the DMA?

Thanks in advance

Best regards

EbbeSand


Code:


XMC_DMA_CH_CONFIG_t GPDMA0_Ch0_config =
{
  .enable_interrupt = true,
  .src_transfer_width = XMC_DMA_CH_TRANSFER_WIDTH_32,
  .dst_transfer_width = XMC_DMA_CH_TRANSFER_WIDTH_32,
  .src_address_count_mode = XMC_DMA_CH_ADDRESS_COUNT_MODE_INCREMENT,
  .dst_address_count_mode = XMC_DMA_CH_ADDRESS_COUNT_MODE_NO_CHANGE,
  .src_burst_length = XMC_DMA_CH_BURST_LENGTH_1,
  .dst_burst_length = XMC_DMA_CH_BURST_LENGTH_1,
  .enable_src_gather = true,
  .enable_dst_scatter = false,
  .transfer_flow = XMC_DMA_CH_TRANSFER_FLOW_M2M_DMA,
  .src_addr = (uint32_t) &GuiLib_DisplayBuf.Words[0][0],
  .dst_addr =  (uint32_t) &LCD_RAM,
  .src_gather_interval = 0,
  .src_gather_count = 32,
  .dst_scatter_interval = 0,
  .dst_scatter_count = 0,
  .block_size = 10,
  .transfer_type = XMC_DMA_CH_TRANSFER_TYPE_MULTI_BLOCK_SRCADR_RELOAD_DSTADR_CONTIGUOUS,
  .priority = XMC_DMA_CH_PRIORITY_7,
  .src_handshaking = XMC_DMA_CH_SRC_HANDSHAKING_SOFTWARE,
};

void GPDMA_Init(void)
{
  XMC_DMA_Init(XMC_DMA0);
  XMC_DMA_CH_Init(XMC_DMA0, 0, &GPDMA0_Ch0_config);
  XMC_DMA_CH_EnableEvent(XMC_DMA0, 0, XMC_DMA_CH_EVENT_BLOCK_TRANSFER_COMPLETE);
  NVIC_SetPriority(GPDMA0_0_IRQn, 11);
  NVIC_EnableIRQ(GPDMA0_0_IRQn);
  XMC_DMA_CH_Enable(XMC_DMA0, 0);  /* start the first block transfer */
}

void GPDMA0_0_IRQHandler(void)
{
  uint32_t event;

  event = XMC_DMA_CH_GetEventStatus(XMC_DMA0, 0);

  /* the event status is a bitmask, so test the flag rather than compare for equality */
  if (event & XMC_DMA_CH_EVENT_BLOCK_TRANSFER_COMPLETE)
  {
    XMC_DMA_CH_ClearEventStatus(XMC_DMA0, 0, XMC_DMA_CH_EVENT_BLOCK_TRANSFER_COMPLETE);
    counter++;

    if (counter > 240)
    {
      XMC_DMA_CH_Disable(XMC_DMA0, 0);
    }
    else
    {
      XMC_DMA_CH_SetSourceAddress(XMC_DMA0, 0, (uint32_t)&GuiLib_DisplayBuf.Words[counter][0]);
      XMC_DMA_CH_Enable(XMC_DMA0, 0);
    }
  }
}

SDMMC app upgrade issues (4.0.22 -> 4.3.22)

Thanks for that, Jesus. I have removed the LED signal and it works perfectly. I might need to re-enable it at some point in the future, but by then a new version of the app will be released. I couldn't actually see your attached version of the app. Thanks,
Nick

AURIX 26X interrupt and timer configuration

Hi,
I am using the Infineon AURIX TC267D for SPI communication with a TDA.
I am able to transmit and receive data properly. In run mode I want the AURIX to send a command every 100 ms, but it is taking around 40-50 seconds.
I believe there may be a problem with the interrupt or timer configuration, but I am unable to find where exactly the error is.
Can anyone please help me out?

AURIX TC22xL oscillator

Hi

We are unable to find information about an internal oscillator (RC or otherwise) for the SAL-TC222L-16F133N in the datasheet. Is there such an oscillator implemented in this device? If so, how is the oscillator source chosen? We have an external crystal connected to XTAL1 and XTAL2; which source is the default in such a case? Also, what should the amplitude of our external crystal be? Currently we have about 0.6 Vp-p; is that fine?

Kind Regards,
Piotr


TLF35584 INIT to NORMAL MODE

I still need to verify this, but I have a similar problem, and I think it comes down to the differences between versions of the chip. Make sure you have the one you think you have.

SSC Tool, SSC_Device, and structure for category strings

The SSC project settings are cached in the SSC app and it's probably picking up "XMC_ESC" from the default Infineon configuration.
You can delete this from SSC > Tool > Options > Configurations by selecting the configuration named "Infineon XMC EtherCAT hardware" and then clicking the (-) button to delete the cached configuration.
Every time you import a configuration XML file into SSC it gets cached, so it's easy to delete and re-load the default configuration from the "Infineon_XMC_ECAT_SSC_Config.xml" file generated by DAVE.

Tricore TC1782 GPT Pins

Hello there.
I have been probing the TC1782 on-board, trying to figure out which pins are GPT S1 and GPT S2 on the MCU.
Any help from you guys will be highly appreciated.
It's an Infineon TriCore "K-TC1782N-320 F180HL".
I have its datasheet and pinout description, but I am unable to identify the proper GPT pins from them.
Kindly guide me on this.

Thank you.

TC26X Trap4 issue

Hello,
when I debug the S/W, the program runs into a trap class 4; the TIN number is 2.
It shows that a data access synchronous error trap occurred.
Have you encountered a similar issue, and what are the possible reasons? Thank you.

On how to use MATLAB/Simulink to simulate EVAL_M1_099M.mdl


Several questions about using iMOTION

First: The IRMCK099 combines the iMOTION™ motor control engine (MCE) with all peripherals required to realize a complete variable speed drive (for permanent magnet motors).
The steps to develop a motor control are MCE(Tiny)Wizard -> MCEDesigner -> MCEProgrammer.
Once you have developed the MCE for one or more motor characteristics (up to 31 motors), you will do the one-time programming (OTP) of the device. You will develop things like ramp-up, speed, and ramp-down of the motor, start, stop, etc., either as a standalone configuration or with a human interface using, for instance, the GPIOs. You will check the result by programming the SRAM of the device with the help of the MCETOOLV2 (start MCEProgrammer). When finished, you can program the OTP. Using the internal SRAM allows an unlimited number of development cycles, while the MCEProgrammer is used to write the firmware and final parameter set into the OTP memory.

Two: For devices having only a Motor Control Engine, without microcontroller, there is only a download firmware binary. No software source code available. Motor parameter sets are downloaded to device as well as mentioned in "First".

Three: Download the installer file from the Tools & Software tab on the eval board's webpage. After installation you will find the circuit diagram in the folder ../documents/iMotion/<Board>/hardware.

Four: Answered in your thread here.

Five: ??.

Sixth: When measuring the winding resistance between two phase terminals, divide the result by two to get the per-phase resistance value for both Y- and delta-connected motors. When measuring the winding inductance, you will likewise always read twice the per-phase inductance. See the TinyWizard (MCEWizard) questions about the motor parameters.

Beginner question: first Board with XMC1100 not recognized by debugger

Hello all,

my name is Michael. I am in the process of replacing some hard-to-come-by Atmel AVRs with something more modern. At the moment I am having a closer look at the XMC1100 family, and I bought the "XMC1100 Boot Kit" to try it out.
Using DAVE, all the simple "Hello world" projects with some blinking LEDs worked fine so far.
Now it's time to see how easy it is to integrate this MCU into existing designs. So, as a first step, I made a simple PCB with just the XMC1100T016F0064ABXUMA1 on it, a simple 5 V regulator, and the 8-pin 2.54 mm debug header to connect the J-Link debugger.
I separated the Boot Kit board from its J-Link part, soldered pin headers to both boards, and made an 8-pin ribbon cable to connect them. After adding power to the Boot Kit board, I was able to program it just fine.
So the cable and my connectors are OK.
I have pin 14 (XMC_CLK, P0.15) of the XMC go to pin 1 (SWCLK) of the 8-pin header,
and pin 15 (XMC_DATA, P0.14) to pin 2 (SWD).
GND goes to pins 4 and 5 of the debug header, and +5 V to pins 3 and 6.
Pins 7 and 8 go to a jumper, so I can decide later where to connect the UART (no jumpers in place at the moment, so they are just floating).
Pin 5 of the XMC goes to GND, pin 6 to +5 V; there is a 100 nF bypass cap near the IC.
All other pins go via 3.3 k resistors to LEDs, so I can see later what happens there.
I can post the schematic if that helps.

When I connect the debugger and try to download a program to the board, it keeps telling me:

SEGGER J-Link GDB Server V6.40 Command Line Version

JLinkARM.dll V6.40 (DLL compiled Oct 26 2018 15:06:02)

Command line: -if swd -device XMC1100-T016x0064 -endian little -speed 1000 -port 2331 -swoport 2332 -telnetport 2333 -vd -ir -localhostonly 1 -singlerun -strict -timeout 0
-----GDB Server start settings-----
GDBInit file: none
GDB Server Listening port: 2331
SWO raw output listening port: 2332
Terminal I/O port: 2333
Accept remote connection: localhost only
Generate logfile: off
Verify download: on
Init regs on start: on
Silent mode: off
Single run mode: on
Target connection timeout: 0 ms
------J-Link related settings------
J-Link Host interface: USB
J-Link script: none
J-Link settings file: none
------Target related settings------
Target device: XMC1100-T016x0064
Target interface: SWD
Target interface speed: 1000kHz
Target endian: little

Connecting to J-Link...
J-Link is connected.
Firmware: J-Link Lite-XMC4200 Rev.1 compiled Apr 5 2017 11:59:07
Hardware: V1.00
S/N: 591110263
Checking target voltage...
Target voltage: 3.30 V
Listening on TCP/IP port 2331
Connecting to target...ERROR: Could not connect to target.
Target connection failed. GDBServer will be closed...Restoring target state and closing J-Link connection...
Shutting down...



Of course the board has power, and the cable is connected correctly.
What am I doing wrong here? Do I need to strap some XMC pins somewhere to activate the boot loader?

Any help would be great. Thank you!

P.S.: if I connect the Boot Kit board instead of my board, it connects fine, despite having a different XMC chip on it.

-Michael

Distance2Go Evaluate recorded data from RadarGUI

Hello,

I'm trying to evaluate data I recorded with the RadarGUI to get the same results as the RadarGUI shows me. I'm working with a Distance2Go Demo Board.
The D2G board detects two targets and displays both in the RadarGUI, so far so good. Then I used the record function to record the IQ data, the FFT data and the target info data.

Now I'm trying to evaluate the recorded FFT data to get the same target info (range and level) as the RadarGUI shows me. However, I cannot get it to work (I'm using MATLAB).
The recorded FFT data (which can easily be pasted into MATLAB) is attached to this post. The FFT bin size is the sampling frequency divided by the FFT size, which is 166666 Hz / 1024 = 162.76 Hz. To get the desired info I should have peaks at the following beat frequencies:
1304.6 Hz and 2609.3 Hz. Therefore the results I should get are: target 1) level: 1341, range 260 cm, and target 2) level: 882, range 139 cm. So the 1304 Hz peak corresponds to target #2, and the 2609 Hz peak to target #1.

I am trying to find the FFT peaks with the following MATLAB function:
Code:

[lvl,idx]=findpeaks(fftdata,'MinPeakHeight',10);
where 10 is the peak threshold and fftdata is a vector that contains the data from the attached file.
The values in lvl correspond to the levels mentioned above. But there is a problem with the beat frequency. I calculate it by
Code:

fbeat=idx.*binsize;
By doing so, I don't end up with the frequencies mentioned above. What am I doing wrong?

Another problem is that I get 11 results instead of 2 by doing that...

I appreciate your help!

Upgrading GCC in DAVE4

WARNING FOR ALL DAVE USERS:
The GCC version 4.9.3 supplied with DAVE 4.4.2 suffers from an optimizer bug that can drop null-pointer checks from your code. I have found in my own code that the bug activates at optimization levels O2, O3 and Os. A description of the bug can be found here:
https://stackoverflow.com/questions/...-check-removal

The same issue is still present at least in GCC 5.3.0, but it is not clear to me when or whether this has been fixed in any official release:
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71867

External Line IRQ

I use
.edge_detection = XMC_ERU_ETL_EDGE_DETECTION_BOTH
for an external IRQ.


This triggers my IRQ function:

void ERU0_0_IRQHandler(void)
{
    if (falling_edge)   /* pseudocode: how do I determine this? */
    {
    }
    else
    {
    }
}

Now I have to find out the edge direction (falling or rising) inside the IRQ function. Is this possible?


Thanks,
Embedded

