Kampi

Members
  • Content Count

    19
  • Joined

  • Last visited

About Kampi

  • Rank
    Member

  1. Kampi

    XADC - AD7 sampled on AD14

    It seems that the inputs are disturbing each other. When I apply the input voltage to only one input, all four LEDs change their brightness as I increase and decrease the voltage. When I connect an input voltage to both AD14 and AD7, I can change the brightness of LED M15 with AD7 and the brightness of the other three LEDs with AD14. I can manipulate the brightness of LED D18 when I connect the input voltage to AD6, and that of all other LEDs when I connect it to AD7.
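    A minimal, untested sketch of one way to narrow this kind of cross-talk down, using the xadcps driver from the code further down this page (the XAdc instance and channel constants are assumed from there): sample only Vaux7 in single-channel mode and check whether its value still follows the voltage applied to the other pins.

        /* Sketch: sample only Vaux7 in single-channel mode (assumes the initialized
         * XAdcPs instance "XAdc" from the code further down this page). */
        XAdcPs_SetSequencerMode(&XAdc, XADCPS_SEQ_MODE_SINGCHAN);

        /* Vaux7, extended acquisition time, continuous (not event-driven) sampling,
         * differential input mode. */
        if(XAdcPs_SetSingleChParams(&XAdc, XADCPS_CH_AUX_MIN + 7, TRUE, FALSE, TRUE) != XST_SUCCESS)
        {
            xil_printf("Single channel setup failed!\n\r");
        }

        u32 Ch7 = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_AUX_MIN + 7);
        xil_printf("Channel 7: %lu\n\r", Ch7);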
  2. Kampi

    XADC - AD7 sampled on AD14

    Hello @jpeyron, I've copied the settings for the XADC and the software from your last project (June 24, 2017) from this thread and modified the code to read Vaux15, Vaux14 and Vaux7:

        ExtVolRawData = XSysMon_GetAdcData(SysMonInstPtr, XSM_CH_AUX_MAX);
        ExtVolData = XSysMon_RawToExtVoltage(ExtVolRawData);
        printf("The Current Vaux15 is %0d.%03d Volts. \r\n", (int)(ExtVolData), SysMonFractionToInt(ExtVolData));

        ExtVolRawData = XSysMon_GetAdcData(SysMonInstPtr, XSM_CH_AUX_MIN + 7);
        ExtVolData = XSysMon_RawToExtVoltage(ExtVolRawData);
        printf("The Current Vaux7 is %0d.%03d Volts. \r\n", (int)(ExtVolData), SysMonFractionToInt(ExtVolData));

        ExtVolRawData = XSysMon_GetAdcData(SysMonInstPtr, XSM_CH_AUX_MAX - 1);
        ExtVolData = XSysMon_RawToExtVoltage(ExtVolRawData);
        printf("The Current Vaux14 is %0d.%03d Volts. \r\n", (int)(ExtVolData), SysMonFractionToInt(ExtVolData));

        usleep(500000);

    But the problem still exists, and the issue exists in the HDL XADC project too. The brightness of LEDs M14 and M15 changes when I modify the voltage on pin AD7 of the Zybo. By the way, the brightness of LEDs D18 and G14 changes with the voltage on pin AD15.
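    As a side note on the channel arithmetic above: in the xsysmon driver XSM_CH_AUX_MIN corresponds to Vaux0 and XSM_CH_AUX_MAX to Vaux15, so XSM_CH_AUX_MIN + 7 is Vaux7 and XSM_CH_AUX_MAX - 1 is Vaux14. A small, untested helper that makes the mapping explicit (SysMonInstPtr and SysMonFractionToInt() are assumed to come from the referenced project):

        /* Hypothetical helper (untested): read and print one auxiliary channel by its
         * Vaux number, so the XSM_CH_AUX_* arithmetic is explicit.
         * SysMonFractionToInt() is assumed from the referenced project. */
        void PrintAuxChannel(XSysMon *InstancePtr, u8 AuxChannel)
        {
            u16 RawData = XSysMon_GetAdcData(InstancePtr, XSM_CH_AUX_MIN + AuxChannel);
            float Voltage = XSysMon_RawToExtVoltage(RawData);
            printf("The Current Vaux%u is %0d.%03d Volts. \r\n", (unsigned int)AuxChannel, (int)(Voltage), SysMonFractionToInt(Voltage));
        }

        /* In the polling loop: */
        PrintAuxChannel(SysMonInstPtr, 7);   /* Vaux7  */
        PrintAuxChannel(SysMonInstPtr, 14);  /* Vaux14 */
        PrintAuxChannel(SysMonInstPtr, 15);  /* Vaux15 */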
  3. Kampi

    XADC - AD7 sampled on AD14

    A small update: it seems that the channel sequencer has something to do with this behavior. The problem exists when I use XADCPS_SEQ_MODE_CONTINPASS or XADCPS_SEQ_MODE_ONEPASS, but disappears when I use XADCPS_SEQ_MODE_SIMUL_SAMPLING.
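    For reference, a minimal sketch (untested, following the pattern of the Xilinx driver examples) of how that mode switch looks: the channel enables are changed while the sequencer is disabled (safe mode), and only then is the desired mode selected.

        /* The sequencer has to be disabled (safe mode) while the channel enables are changed. */
        XAdcPs_SetSequencerMode(&XAdc, XADCPS_SEQ_MODE_SAFE);
        XAdcPs_SetSeqChEnables(&XAdc, XADCPS_SEQ_CH_AUX07 |
                                      XADCPS_SEQ_CH_AUX14 |
                                      XADCPS_SEQ_CH_AUX15);

        /* Switch to the mode that worked here; in simultaneous sampling mode auxiliary
         * channel n is sampled together with channel n + 8 (e.g. Vaux7 with Vaux15). */
        XAdcPs_SetSequencerMode(&XAdc, XADCPS_SEQ_MODE_SIMUL_SAMPLING);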
  4. Kampi

    XADC - AD7 sampled on AD14

    Hello, I have a Zybo Board (Version 1 Rev.) and I have a strange issue with my XADC, which samples the input for channel AD7 on channel AD14. Please take a look at my setup. I want to use the differential channel pair 7 and 15 (like in the photo - upper row VIn and lower row ground from my voltage source). My software gives me results for channels 14 and 15, but the value for channel 7 stays constant even when I increase or decrease the input voltage. Only channels 14 and 15 change their values. I would expect channel 14 to stay constant and channel 7 to change its value.

        Temperature: 46.1576 Degree Celsius
        Vcc INT: 0.9961 V
        Vref+: 1.25 V
        Vref-: 3.00 V
        Channel 7: 2351
        Channel 14: 26032
        Channel 15: 32767

    My code looks like this:

        #include "stdio.h"
        #include "xparameters.h"
        #include "xadcps.h"

        XAdcPs XAdc;
        XAdcPs_Config* ConfigPtr;

        int main()
        {
            ConfigPtr = XAdcPs_LookupConfig(XPAR_XADC_DEVICE_ID);
            if(ConfigPtr == NULL)
            {
                xil_printf("Invalid XADC configuration!");
                return XST_FAILURE;
            }

            XAdcPs_CfgInitialize(&XAdc, ConfigPtr, ConfigPtr->BaseAddress);

            if(XAdcPs_SelfTest(&XAdc) != XST_SUCCESS)
            {
                xil_printf("Self test failed!");
                return XST_FAILURE;
            }

            XAdcPs_Reset(&XAdc);
            XAdcPs_SetSeqChEnables(&XAdc, XADCPS_SEQ_CH_AUX07 | XADCPS_SEQ_CH_AUX14 | XADCPS_SEQ_CH_AUX15);
            XAdcPs_SetSequencerMode(&XAdc, XADCPS_SEQ_MODE_CONTINPASS);

            xil_printf("Start...\n\r");

            while(1)
            {
                u32 Temp = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_TEMP);
                printf("Temperature: %.4f Degree Celsius\n\r", XAdcPs_RawToTemperature(Temp));

                u32 VCCInt = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_VCCINT);
                printf("Vcc INT: %.4f V\n\r", XAdcPs_RawToVoltage(VCCInt));

                u32 VREFp = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_VREFP);
                printf("Vref+: %.2f V\n\r", XAdcPs_RawToVoltage(VREFp));

                u32 VREFn = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_VREFN);
                printf("Vref-: %.2f V\n\r", XAdcPs_RawToVoltage(VREFn));

                u32 Ch7 = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_AUX_MIN + 7);
                xil_printf("Channel 7: %lu\n\r", Ch7);

                u32 Ch14 = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_AUX_MAX - 1);
                xil_printf("Channel 14: %lu\n\r", Ch14);

                u32 Ch15 = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_AUX_MAX);
                xil_printf("Channel 15: %lu\n\r", Ch15);

                xil_printf("-------------\n\r");

                for(u32 i = 0x00; i < 0xFFFFFF; i++);
            }

            return XST_SUCCESS;
        }

    With the following XDC file:

        ##Pmod Header JA (XADC)
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux14_v_n]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux14_v_p]
        set_property PACKAGE_PIN N16 [get_ports Vaux14_v_n]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux6_v_n]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux6_v_p]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux7_v_n]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux7_v_p]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux15_v_n]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux15_v_p]

    So what is going wrong here?
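    One observation on the code above, offered as an untested sketch rather than a verified fix: the xadcps driver also has per-channel settings for differential input mode and extended acquisition time which the code does not touch, and the Xilinx examples change the channel setup while the sequencer is in safe mode. For the differential pairs described above, that could look like this:

        XAdcPs_Reset(&XAdc);

        /* Keep the sequencer disabled (safe mode) while changing the channel setup. */
        XAdcPs_SetSequencerMode(&XAdc, XADCPS_SEQ_MODE_SAFE);

        /* Sample the enabled auxiliary channels differentially (Vp/Vn pairs)... */
        XAdcPs_SetSeqInputMode(&XAdc, XADCPS_SEQ_CH_AUX07 | XADCPS_SEQ_CH_AUX14 | XADCPS_SEQ_CH_AUX15);

        /* ...and give them the extended acquisition (settling) time. */
        XAdcPs_SetSeqAcqTime(&XAdc, XADCPS_SEQ_CH_AUX07 | XADCPS_SEQ_CH_AUX14 | XADCPS_SEQ_CH_AUX15);

        XAdcPs_SetSeqChEnables(&XAdc, XADCPS_SEQ_CH_AUX07 | XADCPS_SEQ_CH_AUX14 | XADCPS_SEQ_CH_AUX15);
        XAdcPs_SetSequencerMode(&XAdc, XADCPS_SEQ_MODE_CONTINPASS);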
  5. Hello, I got the video example running. You can find the first version here: https://gitlab.com/Kampi/Zybo/tree/master/Examples/Video
  6. Hey, thank you for your answer. But I think taking a look at the HDMI out demo would be the better approach. What do you think?
  7. Hello, I'm trying to generate a VGA signal with a VDMA, Video Timing, and AXI4-Stream to Video Out IP for my Zybo. So I created the following block design with the given settings (note: I tested the design with the test pattern generator instead of the VDMA before, so I know that the settings of the Timing Generator and Video Out IP are correct). My code looks like this:

        #ifdef WITH_TESTPATTERN
        #include "xv_tpg.h"
        #endif
        #include "xaxivdma.h"
        #include "xparameters.h"

        #ifdef WITH_TESTPATTERN
        XV_tpg TPG;
        XV_tpg_Config* TPG_Config;
        #endif

        XAxiVdma_Config* VDMA_Config;
        XAxiVdma VDMA;
        XAxiVdma_DmaSetup ReadConfiguration;

        unsigned int frame_buffer[800][600][3];
        unsigned int srcBuffer;
        u32 Status;

        void fill(void)
        {
            for(u32 i = 0x00; i < 800; i++)
            {
                for(u32 j = 0x00; j < 600; j++)
                {
                    frame_buffer[i][j][0] = 0xFF;
                    frame_buffer[i][j][1] = 0xFF;
                    frame_buffer[i][j][2] = 0xFF;
                }
            }
        }

        int main()
        {
            #ifdef WITH_TESTPATTERN
            TPG_Config = XV_tpg_LookupConfig(XPAR_TESTPATTERN_DEVICE_ID);
            if(!TPG_Config)
            {
                xil_printf("Error during test pattern generator configuration!\n\r");
                return -1;
            }

            Status = XV_tpg_CfgInitialize(&TPG, TPG_Config, TPG_Config->BaseAddress);
            if(Status != XST_SUCCESS)
            {
                xil_printf("Error during test pattern generator initialization!\n\r");
                return -1;
            }

            XV_tpg_Set_height(&TPG, 600);
            XV_tpg_Set_width(&TPG, 800);
            XV_tpg_Set_bckgndId(&TPG, 0x0C);
            XV_tpg_EnableAutoRestart(&TPG);
            XV_tpg_Start(&TPG);
            #endif

            VDMA_Config = XAxiVdma_LookupConfig(XPAR_VIDEODMA_DEVICE_ID);
            if(!VDMA_Config)
            {
                xil_printf("Error during VDMA configuration!\n\r");
                return -1;
            }

            Status = XAxiVdma_CfgInitialize(&VDMA, VDMA_Config, VDMA_Config->BaseAddress);
            if(Status != XST_SUCCESS)
            {
                xil_printf("Error during VDMA initialization!\n\r");
                return -1;
            }

            ReadConfiguration.VertSizeInput = 600;
            ReadConfiguration.HoriSizeInput = 800 * (VDMA_Config->Mm2SStreamWidth >> 3);
            ReadConfiguration.Stride = 800 * (VDMA_Config->Mm2SStreamWidth >> 3);
            ReadConfiguration.FrameDelay = 0;
            ReadConfiguration.EnableCircularBuf = 1;
            ReadConfiguration.EnableSync = 0;
            ReadConfiguration.PointNum = 0;
            ReadConfiguration.EnableFrameCounter = 0;
            ReadConfiguration.FixedFrameStoreAddr = 0;

            Status = XAxiVdma_DmaConfig(&VDMA, XAXIVDMA_READ, &ReadConfiguration);
            if(Status != XST_SUCCESS)
            {
                xil_printf("Read channel configuration failed!\n\r");
                return -1;
            }

            fill();

            Status = XAxiVdma_DmaSetBufferAddr(&VDMA, XAXIVDMA_READ, (UINTPTR*)frame_buffer);
            if(Status != XST_SUCCESS)
            {
                xil_printf("Read channel set buffer address failed!\n\r");
                return -1;
            }

            Status = XAxiVdma_DmaStart(&VDMA, XAXIVDMA_READ);
            if(Status != XST_SUCCESS)
            {
                xil_printf("Failed to start DMA engine (read channel)!\n\r");
                return -1;
            }

            xil_printf("Start...\n\r");

            while(1)
            {
            }

            return 0;
        }

     But the monitor doesn't show the picture (and there is no message that the signal is missing, so HSync and VSync work), and I get the terminal message "Read channel set buffer address failed!". So what is wrong with the code? Thank you, guys.
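     A hedged sketch related to the failing call above (not verified against this exact design): XAxiVdma_DmaSetBufferAddr() expects an array with one start address per frame store rather than a pointer to the pixel data itself, and the addresses have to satisfy the alignment the core was built with unless DRE is enabled. Reusing the names from the code above, that could look like this:

        /* One start address per frame store; here every frame store points to the same buffer.
         * VDMA, VDMA_Config, frame_buffer and Status are assumed from the code above. */
        UINTPTR FrameAddr[XAXIVDMA_MAX_FRAMESTORE];

        for(u32 i = 0; i < VDMA_Config->MaxFrameStoreNum; i++)
        {
            FrameAddr[i] = (UINTPTR)frame_buffer;
        }

        Status = XAxiVdma_DmaSetBufferAddr(&VDMA, XAXIVDMA_READ, FrameAddr);
        if(Status != XST_SUCCESS)
        {
            xil_printf("Read channel set buffer address failed!\n\r");
            return -1;
        }

     Separately, the buffer layout would still have to match the stride: HoriSizeInput and Stride assume Mm2SStreamWidth/8 bytes per pixel and line, while frame_buffer stores three unsigned int values (12 bytes) per entry, so the memory layout may not line up with what the read channel streams out.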
  8. Sorry for the late answer. I got my Ethernet running now. I think my Yocto image was the source of the problem. With the Xilinx ramdisk example the Ethernet works well, so I created a new Yocto image today and now my Ethernet works well with my new SD card image. Thank you for the help. If someone else needs a working image for the Zybo, please check my Git project https://github.com/Kampi/Zybo-Linux, where you will find all the necessary files.
  9. Hello, could this problem have something to do with missing iptables?
  10. Hello, yes, I think I have to test it with Wireshark (I will do it this week). The connection between the switch and the PC works, because I have my Raspberry Pi connected to that switch and I can ping it (maybe I'm using an uplink port of that switch? Just an idea I've got at this moment).
  11. Hello, the settings for my Zybo:

        root@linaro-developer:~# ifconfig
        eth0      Link encap:Ethernet  HWaddr 00:0a:35:00:01:22
                  inet addr:192.168.178.160  Bcast:192.168.178.255  Mask:255.255.255.0
                  UP BROADCAST MULTICAST  MTU:1500  Metric:1
                  RX packets:0 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)
                  Interrupt:146 Base address:0xb000

        lo        Link encap:Local Loopback
                  inet addr:127.0.0.1  Mask:255.0.0.0
                  inet6 addr: ::1/128 Scope:Host
                  UP LOOPBACK RUNNING  MTU:65536  Metric:1
                  RX packets:3054 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:3054 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1
                  RX bytes:290010 (283.2 KiB)  TX bytes:290010 (283.2 KiB)

        root@linaro-developer:~#

      Please see the screenshots for my settings:
  12. Hello, thank you. The Ethernet interface is working now, but I can't ping the Zybo from my PC. The IP address etc. should be OK, because I can ping another device if I plug the LAN cable from the Zybo into that other device. Is there anything else which can cause problems and block my ping?
  13. Great, this solution works. Now I have an Ethernet device. But why does this work?
  14. Hello, I've connected MDIO to MIO 52 and 53 now. "ip link" doesn't show any interface:

        root@linaro-developer:~# ip link show
        1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1
            link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
        2: sit0@NONE: <NOARP> mtu 1480 qdisc noop state DOWN mode DEFAULT group default qlen 1
            link/sit 0.0.0.0 brd 0.0.0.0
        root@linaro-developer:~#

      And I get these messages:

        Failed to start Raise network interfaces.
        See 'systemctl status networking.service' for details.
        [ OK ] Reached target Network.

        root@linaro-developer:~# systemctl status networking.service
        ● networking.service - Raise network interfaces
           Loaded: loaded (/lib/systemd/system/networking.service; enabled; vendor prese
          Drop-In: /run/systemd/generator/networking.service.d
                   └─50-insserv.conf-$network.conf
           Active: failed (Result: exit-code) since Sat 2016-05-21 22:31:35 UTC; 1min 28
             Docs: man:interfaces(5)
         Main PID: 1787 (code=exited, status=1/FAILURE)

        May 21 22:31:34 linaro-developer ifup[1787]: Cannot find device "eth0"
        May 21 22:31:34 linaro-developer ifup[1787]: Failed to bring up eth0.
        May 21 22:31:34 linaro-developer ntpdate[1907]: Can't find host 0.debian.pool.nt
        May 21 22:31:34 linaro-developer ntpdate[1907]: Can't find host 1.debian.pool.nt
        May 21 22:31:34 linaro-developer ntpdate[1907]: Can't find host 2.debian.pool.nt
        May 21 22:31:34 linaro-developer ntpdate[1907]: Can't find host 3.debian.pool.nt
        May 21 22:31:34 linaro-developer ntpdate[1907]: no servers can be used, exiting
        May 21 22:31:35 linaro-developer systemd[1]: Failed to start Raise network inter
        May 21 22:31:35 linaro-developer systemd[1]: networking.service: Failed with res
        lines 1-17/17 (END)

        root@linaro-developer:~# dmesg | grep ethernet
        [    0.808309] macb e000b000.ethernet: failed to get macb_clk (4294967294)
        [    0.813579] macb: probe of e000b000.ethernet failed with error -2
  15. Hello, thank you for your answer. Please take a look at "Ethernet.png" for my Ethernet settings. I use the same settings as in this thread, but I don't have any Ethernet interface when I use the "ip link show" command after booting Linux:

        root@linaro-developer:~# ip link show
        1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1
            link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
        2: sit0@NONE: <NOARP> mtu 1480 qdisc noop state DOWN mode DEFAULT group default qlen 1
            link/sit 0.0.0.0 brd 0.0.0.0
        root@linaro-developer:~#

      Should I make that MDIO port external?